Search results

1 – 10 of 916
Article
Publication date: 24 November 2022

Jens K. Perret

The purpose of the study is to fill a gap in the literature on mathematical production planning (joint balancing and sequencing) in the fashion industry. It considers in…

Abstract

Purpose

The purpose of the study is to fill a gap in the literature on mathematical production planning (joint balancing and sequencing) in the fashion industry. It considers in particular situations of mass customization, made-to-measure or small lot sizes.

Design/methodology/approach

The paper develops a mathematical model based on product options and attributes instead of fixed variants. It proposes an easy-to-use genetic algorithm to solve the resulting optimization problem. Functionality and performance of the algorithm are illustrated via a computational study.
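
As a rough illustration of the kind of genetic algorithm described above (not the paper's own implementation), the sketch below evolves a production sequence using a permutation encoding, order crossover and swap mutation. The binary option matrix, the workload-smoothing objective and all parameter values are invented stand-ins, and the sketch is single-objective rather than the paper's multi-objective balancing-and-sequencing model.

import random

random.seed(42)

# Hypothetical data: 12 made-to-measure orders, each a vector of binary options
# (e.g. special collar, embroidery). The paper's model works with options and
# attributes instead of fixed variants; this matrix is only a stand-in.
NUM_ORDERS, NUM_OPTIONS = 12, 4
orders = [[random.randint(0, 1) for _ in range(NUM_OPTIONS)] for _ in range(NUM_ORDERS)]

def workload_cost(sequence):
    """Toy objective: penalise consecutive orders demanding the same option,
    a crude proxy for smoothing station workload along the line."""
    return sum(
        sum(o1 & o2 for o1, o2 in zip(orders[a], orders[b]))
        for a, b in zip(sequence, sequence[1:])
    )

def order_crossover(p1, p2):
    """Classic OX crossover for permutation chromosomes."""
    i, j = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[i:j] = p1[i:j]
    fill = [g for g in p2 if g not in child]
    for k in range(len(child)):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def mutate(seq, rate=0.2):
    if random.random() < rate:
        a, b = random.sample(range(len(seq)), 2)
        seq[a], seq[b] = seq[b], seq[a]
    return seq

# Simple generational GA with truncation selection.
population = [random.sample(range(NUM_ORDERS), NUM_ORDERS) for _ in range(30)]
for generation in range(200):
    population.sort(key=workload_cost)
    parents = population[:10]
    children = [
        mutate(order_crossover(random.choice(parents), random.choice(parents)))
        for _ in range(20)
    ]
    population = parents + children

best = min(population, key=workload_cost)
print("best sequence:", best, "cost:", workload_cost(best))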

Findings

An easy-to-implement yet efficient algorithm is proposed to solve the multi-objective formulation of a problem structure that is becoming increasingly relevant in the fashion industry. Applying the algorithm revealed that it is well suited to generating significant savings and that these savings are insensitive to problem size and thus to company size.

Practical implications

The solutions from the algorithm (the Pareto-efficient frontier) offer decision-makers more flexibility in selecting the solutions they deem most fitting for their situation. The computational study illustrates the significant monetary savings possible by applying the proposed algorithm to practical situations.

Originality/value

In contrast to existing papers, and to the best of the author's knowledge for the first time, the joint balancing and sequencing approach is applied to the fashion industry rather than the automotive industry. The applicability of the approach to specific fields of the fashion industry is discussed. An option- and attribute-based model, rarely used in general assembly line sequencing, is employed for greater flexibility in representing a diverse set of model types.

Details

Research Journal of Textile and Apparel, vol. 27 no. 3
Type: Research Article
ISSN: 1560-6074

Open Access
Article
Publication date: 28 April 2020

Alessandro Tufano, Riccardo Accorsi and Riccardo Manzini

This paper addresses the trade-off between asset investment and food safety in the design of a food catering production plant. It analyses the relationship between the quality…

Abstract

Purpose

This paper addresses the trade-off between asset investment and food safety in the design of a food catering production plant. It analyses the relationship between the quality decay of cook-warm products, the logistics of the processes and the economic investment in production machines.

Design/methodology/approach

A weekly cook-warm production plan was monitored in the field using temperature sensors to estimate the quality decay profile of each product. A multi-objective optimisation model is proposed to (1) minimise the number of resources necessary to perform cooking and packing operations or (2) maximise the food quality of the products. A metaheuristic simulated annealing algorithm is introduced to solve the model and to identify the Pareto frontier of the problem.
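
A minimal sketch of the simulated-annealing idea, assuming a scalarised (weighted) combination of a makespan term and a quality-decay term; the authors' model is genuinely multi-objective and data-driven, so the task times, decay rates and weight below are purely illustrative.

import math
import random

random.seed(7)

# Hypothetical instance: 10 production tasks, each with a processing time and a
# quality-decay penalty that grows the later the task finishes. The weight that
# scalarises the two goals is arbitrary and used only for this sketch.
times = [random.randint(5, 20) for _ in range(10)]
decay = [random.uniform(0.5, 2.0) for _ in range(10)]
WEIGHT = 0.1

def cost(sequence):
    finish, quality_loss = 0, 0.0
    for task in sequence:
        finish += times[task]
        quality_loss += decay[task] * finish   # later finish -> more decay
    return finish + WEIGHT * quality_loss

def neighbour(sequence):
    """Swap two random positions to obtain a neighbouring schedule."""
    s = sequence[:]
    a, b = random.sample(range(len(s)), 2)
    s[a], s[b] = s[b], s[a]
    return s

current = list(range(10))
best = current[:]
temperature = 100.0
while temperature > 0.1:
    candidate = neighbour(current)
    delta = cost(candidate) - cost(current)
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        current = candidate
        if cost(current) < cost(best):
            best = current[:]
    temperature *= 0.995   # geometric cooling schedule

print("best schedule:", best, "cost:", round(cost(best), 2))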

Findings

The packaging buffers are identified as the bottleneck of the process. The outcome of the algorithm highlights that a small investment in designing bigger buffers results in a significant increase in quality and a smaller food loss.

Practical implications

This study models the production tasks of a food catering facility to evaluate their criticality from a food safety perspective. It investigates the trade-off between the investment cost of the resources processing critical tasks and the food safety of the finished products.

Social implications

The methodology applies to the design of cook-warm production. Catering companies use cook-warm production to serve schools, hospitals and companies. For this reason, applying the methodology improves the quality of daily meals for a large number of people.

Originality/value

The paper introduces a new multi-objective function (asset investment vs food quality) and proposes an original metaheuristic to address this trade-off in the food catering industry. The methodology is also applied and validated in the design of a new food production facility.

Details

British Food Journal, vol. 122 no. 7
Type: Research Article
ISSN: 0007-070X

Abstract

Details

Acceptability of Transport Pricing Strategies
Type: Book
ISBN: 978-0-08-044199-3

Article
Publication date: 4 October 2011

A. Mazeika Bilbao, A.L. Carrano, M. Hewitt and B.K. Thorn

This paper seeks to frame and model the environmental issues and impacts associated with the management of pallets throughout the entire life cycle, from materials to…

Abstract

Purpose

This paper seeks to frame and model the environmental issues and impacts associated with the management of pallets throughout the entire life cycle, from materials and manufacturing through use and transportation to end-of-life disposal.

Design/methodology/approach

A linear minimum cost multi‐commodity network flow problem is developed to make pallet‐related decisions based on both environmental and economic considerations.
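
A small worked example of a minimum-cost network-flow LP of the kind referred to above, solved with SciPy. It is single-commodity, and every node, arc and cost is invented for demonstration; the paper's model is multi-commodity and weighs environmental as well as economic considerations.

import numpy as np
from scipy.optimize import linprog

# Invented pallet-flow network: a manufacturer and a refurbisher supply a
# customer, either directly or through a warehouse.
nodes = ["new_mfg", "refurbisher", "warehouse", "customer"]
arcs = [                                  # (tail, head, unit cost)
    ("new_mfg", "warehouse", 8.0),
    ("refurbisher", "warehouse", 3.0),
    ("warehouse", "customer", 2.0),
    ("new_mfg", "customer", 10.0),
    ("refurbisher", "customer", 5.0),
]
supply = {"new_mfg": 30, "refurbisher": 20, "warehouse": 0, "customer": -50}

cost = np.array([c for _, _, c in arcs])

# Node-arc incidence matrix: +1 where an arc leaves a node, -1 where it enters.
A_eq = np.zeros((len(nodes), len(arcs)))
for j, (tail, head, _) in enumerate(arcs):
    A_eq[nodes.index(tail), j] = 1.0
    A_eq[nodes.index(head), j] = -1.0
b_eq = np.array([supply[n] for n in nodes])

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
for (tail, head, _), flow in zip(arcs, res.x):
    print(f"{tail:>12} -> {head:<10} flow = {flow:5.1f}")
print("total cost:", res.fun)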

Findings

This paper presents a review of the environmental impacts associated with pallets by life cycle stage. The materials used to fabricate pallets, the methods by which pallets are treated for specific applications, and various pallet management models are described with respect to embodied energy, toxicity and emissions. The need for companies to understand the cost, durability and environmental-impact trade-offs presented by pallet choices is highlighted. The paper introduces a model that balances these trade-offs to assist in choosing both how pallets are managed and the material from which they are constructed.

Originality/value

There is limited research on the environmental impact of different approaches to managing large-scale pallet operations. The proposed model and approach provide companies seeking more sustainable supply chain and distribution practices with insights and a decision-making tool not previously available.

Details

Management Research Review, vol. 34 no. 11
Type: Research Article
ISSN: 2040-8269

Content available
Book part
Publication date: 30 March 2006

Abstract

Details

Structural Models of Wage and Employment Dynamics
Type: Book
ISBN: 978-0-44452-089-0

Article
Publication date: 1 January 2002

Michael Watkins and Susan Rosegrant

Much of the negotiation literature involves two parties that are each assumed to behave in a unitary manner, although a growing body of knowledge considers more complex…

Abstract

Much of the negotiation literature involves two parties that are each assumed to behave in a unitary manner, although a growing body of knowledge considers more complex negotiations. Examples of the latter include negotiations in which one or both of two parties do not behave in a unitary manner, negotiations with multiple parties on one or both sides, parties on multiple sides, and parties engaged in separate but linked negotiations. Greater degrees of complexity distinguish these negotiations from negotiations with two unitary parties.

Details

International Journal of Conflict Management, vol. 13 no. 1
Type: Research Article
ISSN: 1044-4068

Article
Publication date: 16 April 2018

Qi Zhou, Xinyu Shao, Ping Jiang, Tingli Xie, Jiexiang Hu, Leshi Shu, Longchao Cao and Zhongmei Gao

Engineering system design and optimization problems are usually multi-objective and constrained and have uncertainties in the inputs. These uncertainties might significantly…

Abstract

Purpose

Engineering system design and optimization problems are usually multi-objective and constrained and have uncertainties in the inputs. These uncertainties might significantly degrade the overall performance of engineering systems and change the feasibility of the obtained solutions. This paper aims to propose a multi-objective robust optimization approach based on Kriging metamodel (K-MORO) to obtain the robust Pareto set under the interval uncertainty.

Design/methodology/approach

In K-MORO, the nested optimization structure is reduced to a single-loop optimization structure to ease the computational burden. Considering that the interpolation uncertainty of the Kriging metamodel may affect the robustness of the Pareto optima, an objective switching and sequential updating strategy is introduced in K-MORO to determine (1) whether the robust analysis or the Kriging metamodel should be used to evaluate the robustness of design alternatives and (2) which design alternatives are selected to improve the prediction accuracy of the Kriging metamodel during the robust optimization process.
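
A simplified sketch of the surrogate-plus-interval-robustness ingredient behind approaches of this kind: a Kriging (Gaussian process) metamodel is fitted to a stand-in objective, and each candidate design is scored by its worst predicted value over an interval around it. The objective, the interval width and the sampling scheme are assumptions for illustration; K-MORO's single-loop structure, constraints and sequential updating strategy are not reproduced.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Stand-in "expensive" objective; in practice this would be a simulation.
def expensive_objective(x):
    return np.sin(3.0 * x) + 0.5 * x**2

# 1. Fit a Kriging (Gaussian process) metamodel from a handful of samples.
X_train = rng.uniform(-2.0, 2.0, size=(12, 1))
y_train = expensive_objective(X_train).ravel()
kriging = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
kriging.fit(X_train, y_train)

# 2. Robustness under interval uncertainty: for each nominal design x, take the
#    worst predicted objective over the box [x - DELTA, x + DELTA].
DELTA = 0.2
candidates = np.linspace(-2.0, 2.0, 41)

def robust_value(x, n_samples=50):
    perturbed = np.linspace(x - DELTA, x + DELTA, n_samples).reshape(-1, 1)
    prediction = kriging.predict(perturbed)
    return prediction.max()          # worst case for a minimisation objective

robust_scores = [robust_value(x) for x in candidates]
best = candidates[int(np.argmin(robust_scores))]
print(f"robust minimiser (on the surrogate): x = {best:.2f}")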

Findings

Five numerical and engineering cases are used to demonstrate the applicability of the proposed approach. The results illustrate that K-MORO is able to obtain a robust Pareto frontier while significantly reducing the computational cost.

Practical implications

The proposed approach exhibits great capability for practical engineering design optimization problems that are multi-objective and constrained and have uncertainties.

Originality/value

A K-MORO approach is proposed, which can obtain the robust Pareto set under the interval uncertainty and ease the computational burden of the robust optimization process.

Details

Engineering Computations, vol. 35 no. 2
Type: Research Article
ISSN: 0264-4401

Open Access
Article
Publication date: 2 December 2016

Juan Aparicio

The purpose of this paper is to provide an outline of the major contributions in the literature on the determination of the least distance in data envelopment analysis (DEA). The…

Abstract

Purpose

The purpose of this paper is to provide an outline of the major contributions in the literature on the determination of the least distance in data envelopment analysis (DEA). The focus herein is primarily on methodological developments. Specifically, attention is mainly paid to modeling aspects, computational features, the satisfaction of properties and duality. Finally, some promising avenues of future research on this topic are stated.

Design/methodology/approach

DEA is a methodology based on mathematical programming for assessing the relative efficiency of a set of decision-making units (DMUs) that use several inputs to produce several outputs. DEA is classified in the literature as a non-parametric method because it does not assume a particular functional form for the underlying production function and, in this sense, presents some outstanding properties: the efficiency of firms may be evaluated independently of the market prices of the inputs used and outputs produced; it may easily be used with multiple inputs and outputs; a single efficiency score is obtained for each assessed organization; the technique ranks organizations based on relative efficiency; and, finally, it yields benchmarking information. When applied to a dataset of observations and variables (inputs and outputs), DEA models provide both benchmarking information and efficiency scores for each of the evaluated units. Without a doubt, this benchmarking information gives DEA a distinct advantage over other efficiency methodologies, such as stochastic frontier analysis (SFA).

Technical inefficiency is typically measured in DEA as the distance between the observed unit and a "benchmarking" target on the estimated piece-wise linear efficient frontier. The choice of this target is critical for assessing the potential performance of each DMU in the sample, as well as for providing information on how to increase its performance. However, traditional DEA models yield targets that are determined by the "furthest" efficient projection from the evaluated DMU. The projected point on the efficient frontier obtained in this way may not be a representative projection for the evaluated unit, and consequently some authors in the literature have suggested determining the closest targets instead. The general argument behind this idea is that closer targets suggest directions of enhancement for the inputs and outputs of the inefficient units that may lead them to efficiency with less effort. Indeed, authors like Aparicio et al. (2007) have shown, in an application on airlines, that it is possible to find substantial differences between the targets provided by applying the criterion used by the traditional DEA models and those obtained when the criterion of closeness is used for determining projection points on the efficient frontier.

The determination of closest targets is connected to the calculation of the least distance from the evaluated unit to the efficient frontier of the reference technology. In fact, the former is usually computed by solving mathematical programming models associated with minimizing some type of distance (e.g. Euclidean). In this particular respect, the main contribution in the literature is the paper by Briec (1998) on Hölder distance functions, where technical inefficiency with respect to the "weakly" efficient frontier is formally defined through mathematical distances.
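
For readers unfamiliar with the envelopment programs mentioned above, the following sketch solves the standard input-oriented radial (CCR) DEA model for each unit with SciPy, using invented data. This traditional formulation yields the "furthest" radial projection discussed above; the closest-target, least-distance models surveyed in the paper require different and harder programs.

import numpy as np
from scipy.optimize import linprog

# Invented data: 4 DMUs, 2 inputs, 1 output (one row per DMU).
X = np.array([[2.0, 4.0], [3.0, 2.0], [5.0, 5.0], [4.0, 3.0]])   # inputs
Y = np.array([[1.0], [1.0], [2.0], [1.5]])                        # outputs

def ccr_efficiency(o):
    n = X.shape[0]
    # Decision vector: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.concatenate(([1.0], np.zeros(n)))
    # Input constraints:  X^T lambda - theta * x_o <= 0
    A_inputs = np.hstack((-X[o].reshape(-1, 1), X.T))
    b_inputs = np.zeros(X.shape[1])
    # Output constraints: -Y^T lambda <= -y_o
    A_outputs = np.hstack((np.zeros((Y.shape[1], 1)), -Y.T))
    b_outputs = -Y[o]
    res = linprog(
        c,
        A_ub=np.vstack((A_inputs, A_outputs)),
        b_ub=np.concatenate((b_inputs, b_outputs)),
        bounds=[(None, None)] + [(0, None)] * n,
    )
    return res.fun

for o in range(X.shape[0]):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")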

Findings

All the interesting features of the determination of closest targets from a benchmarking point of view have, in recent times, generated increasing interest among researchers in the calculation of the least distance to evaluate technical inefficiency (Aparicio et al., 2014a). In this paper, we therefore present a general classification of the published contributions, mainly from a methodological perspective, and additionally indicate avenues for further research on this topic.

The approaches that we cite in this paper differ in the way the idea of similarity is made operative. Similarity is, in this sense, implemented as the closeness between the values of the inputs and/or outputs of the assessed units and those of the obtained projections on the frontier of the reference production possibility set. Similarity may be measured through multiple distances and efficiency measures; the aim, in turn, is to globally minimize the DEA model slacks to determine the closest efficient targets. However, as we show later in the text, minimizing a mathematical distance in DEA is not an easy task, as it is equivalent to minimizing the distance to the complement of a polyhedral set, which is not a convex set. This complexity justifies the existence of different alternatives for solving these types of models.

Originality/value

To the best of our knowledge, this is the first survey on this topic.

Details

Journal of Centrum Cathedra, vol. 9 no. 2
Type: Research Article
ISSN: 1851-6599

Article
Publication date: 11 June 2018

Antonis Pavlou, Michalis Doumpos and Constantin Zopounidis

The optimization of investment portfolios is a topic of major importance in financial decision making, with many models available in the relevant literature. The purpose…

Abstract

Purpose

The optimization of investment portfolios is a topic of major importance in financial decision making, with many models available in the relevant literature. The purpose of this paper is to perform a thorough comparative assessment of different bi-objective models, as well as a multi-objective one, in terms of the performance and robustness of the whole set of Pareto optimal portfolios.

Design/methodology/approach

In this study, three bi-objective models are considered (mean-variance (MV), mean absolute deviation and conditional value-at-risk (CVaR)), as well as a multi-objective model. An extensive comparison is performed using data from the Standard and Poor's 500 index over the period 2005-2016, through a rolling-window testing scheme. The results are analyzed using novel performance indicators representing the deviations between historical (estimated) efficient frontiers, actual out-of-sample efficient frontiers and realized out-of-sample portfolio results.
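
A toy sketch of how an estimated mean-variance frontier of the kind compared in the study can be traced, by minimising portfolio variance over a grid of target returns. The synthetic return data, long-only bounds and target grid are assumptions for illustration, not the S&P 500 setup or the rolling-window scheme used by the authors.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Synthetic daily returns for five assets (stand-in data only).
returns = rng.normal(0.0005, 0.01, size=(500, 5))
mu = returns.mean(axis=0)
cov = np.cov(returns, rowvar=False)

def min_variance_weights(target_return):
    n = len(mu)
    constraints = [
        {"type": "eq", "fun": lambda w: w.sum() - 1.0},              # fully invested
        {"type": "eq", "fun": lambda w: w @ mu - target_return},     # hit target mean
    ]
    result = minimize(
        lambda w: w @ cov @ w,                 # portfolio variance
        x0=np.full(n, 1.0 / n),
        method="SLSQP",
        bounds=[(0.0, 1.0)] * n,               # long-only
        constraints=constraints,
    )
    return result.x

# Trace the estimated frontier over a grid of attainable target returns.
for target in np.linspace(mu.min(), mu.max(), 5):
    w = min_variance_weights(target)
    print(f"target {target:+.5f}  risk {np.sqrt(w @ cov @ w):.5f}  weights {np.round(w, 2)}")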

Findings

The obtained results indicate that the well-known MV model provides quite robust results compared to other bi-objective optimization models. On the other hand, the CVaR model appears to be the least robust model. The multi-objective approach offers results which are well balanced and quite competitive against simpler bi-objective models, in terms of out-of-sample performance.

Originality/value

This is the first comparative study of portfolio optimization models that examines the performance of the whole set of efficient portfolios, proposing analytical ways to assess their stability and robustness over time. Moreover, an extensive out-of-sample testing of a multi-objective portfolio optimization model is performed through a rolling-window scheme, in contrast to the static results in prior works. The insights derived from the obtained results could be used to design improved and more robust portfolio optimization models, focusing on a multi-objective setting.

Details

Management Decision, vol. 57 no. 2
Type: Research Article
ISSN: 0025-1747

Article
Publication date: 11 November 2013

José Miguel Monzón-Verona, Santiago Garcia-Alonso, Javier Sosa and Juan A. Montiel-Nelson

The purpose of this paper is to explain in detail the optimization of the sensitivity versus the power consumption of a pressure microsensor using multi-objective genetic…

Abstract

Purpose

The purpose of this paper is to explain in detail the optimization of the sensitivity versus the power consumption of a pressure microsensor using multi-objective genetic algorithms.

Design/methodology/approach

The tradeoff between sensitivity and power consumption is analyzed and the Pareto frontier is identified by using NSGA-II, AMGA-II and ɛ-MOEA methods.
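
As a small illustration of the Pareto-frontier idea underlying NSGA-II, AMGA-II and ε-MOEA, the sketch below applies a plain non-domination filter to randomly generated sensitivity/power pairs; it is not any of those algorithms, only the dominance test they build on, and the design data are invented.

import random

random.seed(3)

# Stand-in designs for the sensitivity-versus-power-consumption trade-off.
designs = [
    {"sensitivity": random.uniform(0.1, 1.0),       # maximise
     "power_mw": random.uniform(0.5, 5.0)}          # minimise
    for _ in range(50)
]

def dominates(a, b):
    """a dominates b if it is no worse in both objectives and strictly better in one."""
    return (a["sensitivity"] >= b["sensitivity"] and a["power_mw"] <= b["power_mw"]
            and (a["sensitivity"] > b["sensitivity"] or a["power_mw"] < b["power_mw"]))

pareto_front = [d for d in designs if not any(dominates(other, d) for other in designs)]

for d in sorted(pareto_front, key=lambda d: d["power_mw"]):
    print(f"power = {d['power_mw']:.2f} mW, sensitivity = {d['sensitivity']:.3f}")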

Findings

Comparison results demonstrate that NSGA-II provides optimal solutions over the entire design space in terms of the spread metric, whereas AMGA-II performs better in terms of the convergence metric.

Originality/value

This paper provides a new multi-objective optimization tool for designers of low-power pressure microsensors.

Details

Engineering Computations, vol. 30 no. 8
Type: Research Article
ISSN: 0264-4401
