Search results

1 – 10 of over 2000
Content available

Abstract

Details

Kybernetes, vol. 41 no. 7/8
Type: Research Article
ISSN: 0368-492X

Content available
Article
Publication date: 15 June 2017

Ali Dadashi, Maxim A. Dulebenets, Mihalis M. Golias and Abdolreza Sheikholeslami



Abstract

Purpose

The paper aims to propose a new mathematical model for allocation and scheduling of vessels at multiple marine container terminals of the same port, considering the access channel depth variations by time of day.

Design/methodology/approach

This paper proposes a new mathematical model for allocation and scheduling of vessels at multiple marine container terminals of the same port, considering the access channel depth variations by time of day. The access channel serves as a gate for vessels entering or leaving the port. During low-depth tidal periods, vessels with deep drafts have to wait until the access channel reaches the required depth.
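
To illustrate how such a tidal restriction can enter a linear model, the sketch below sets up a small discrete-time berth allocation problem in Python with the open-source PuLP package. It is not the authors' formulation: the hourly depth profile, the two terminals, the three vessels and their drafts and handling times are all invented, and the only tidal rule imposed is that a vessel may start service only in an hour whose channel depth covers its draft.

```python
# A minimal sketch, not the authors' formulation: discrete-time berth
# allocation over one day for two hypothetical terminals of one port.
# Requires the open-source PuLP package (pip install pulp).
import pulp

HOURS = range(24)
# hypothetical hourly channel depth in metres (a crude semi-diurnal tide)
depth = [14.0 if h % 12 >= 6 else 11.0 for h in HOURS]

# hypothetical vessels: arrival hour, draft (m), handling time (h)
vessels = {"V1": (0, 13.5, 5), "V2": (2, 10.0, 4), "V3": (3, 13.0, 6)}
terminals = ["T1", "T2"]

prob = pulp.LpProblem("tidal_berth_allocation", pulp.LpMinimize)

# s[v, t, k] = 1 if vessel v starts service at terminal k in hour t;
# variables are only created for feasible start hours (after arrival,
# finishing within the horizon, and with enough water over the channel)
s = {
    (v, t, k): pulp.LpVariable(f"s_{v}_{t}_{k}", cat="Binary")
    for v, (arr, draft, dur) in vessels.items()
    for k in terminals
    for t in HOURS
    if t >= arr and t + dur <= len(HOURS) and depth[t] >= draft
}

# objective: minimise total turnaround time (departure minus arrival)
prob += pulp.lpSum(
    (t + vessels[v][2] - vessels[v][0]) * var for (v, t, k), var in s.items()
)

# each vessel is scheduled exactly once
for v in vessels:
    prob += pulp.lpSum(var for (vv, t, k), var in s.items() if vv == v) == 1

# each terminal serves at most one vessel in any hour
for k in terminals:
    for t in HOURS:
        prob += pulp.lpSum(
            var
            for (v, tau, kk), var in s.items()
            if kk == k and tau <= t < tau + vessels[v][2]
        ) <= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for (v, t, k), var in sorted(s.items()):
    if var.value() and var.value() > 0.5:
        print(f"{v}: berth at {k}, start hour {t}, depart hour {t + vessels[v][2]}")
```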

Findings

A number of numerical experiments are performed using operational data collected from the Port of Bandar Abbas (Iran). The results demonstrate that the suggested methodology is able to improve existing port operations and significantly decrease delayed vessel departures.

Originality/value

The contribution of this study to the state of the art is a novel mathematical model for allocation and scheduling of vessels at multiple terminals of the same port, taking into consideration channel depth variations by time of day. To the best of the authors’ knowledge, this is the first continuous berth scheduling linear model that addresses the tidal effects on berth scheduling (both in terms of vessel arrival at and departure from the berth) at multiple marine container terminals.

Details

Maritime Business Review, vol. 2 no. 2
Type: Research Article
ISSN: 2397-3757

Keywords

Open Access
Article
Publication date: 31 July 2023

Mohsen Anvari, Alireza Anvari and Omid Boyer



Abstract

Purpose

This paper aims to examine the integration of lateral transshipment and road vulnerability into the humanitarian relief chain, in light of affected-area priority, in order to address equitable distribution and to assess the impact of various parameters on the total average inflated distance traveled per relief item.

Design/methodology/approach

After identifying comprehensive critical criteria and subcriteria, a hybrid multi-criteria decision-making framework was applied to obtain the demand points’ weight and ranking in a real-life earthquake scenario. Direct shipment and lateral transshipment models were then presented and compared. The developed mathematical models are formulated as mixed-integer programming models, considering facility location, inventory prepositioning, road vulnerability and quantity of lateral transshipment.
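
As a rough illustration of the prioritization step (not the hybrid MCDM framework used in the study), the sketch below ranks a few hypothetical demand points with a simple weighted-sum score over min-max normalised criteria; the criteria, their weights and all figures are invented.

```python
# A minimal sketch, not the paper's hybrid MCDM framework: ranking
# hypothetical demand points with a simple weighted-sum score. Every
# criterion here is oriented so that a larger value means a higher
# relief priority; all names and numbers are illustrative.
import numpy as np

points = ["DP1", "DP2", "DP3", "DP4"]
# columns: affected population, road-vulnerability index, distance to the
# nearest relief distribution centre (km)
scores = np.array([
    [12000, 0.7, 18.0],
    [ 4000, 0.9, 35.0],
    [ 9000, 0.4,  8.0],
    [15000, 0.6, 22.0],
])
weights = np.array([0.5, 0.3, 0.2])  # assumed criterion weights (sum to 1)

# min-max normalise each criterion to [0, 1], then take the weighted sum
norm = (scores - scores.min(axis=0)) / (scores.max(axis=0) - scores.min(axis=0))
priority = norm @ weights

for point, w in sorted(zip(points, priority), key=lambda pair: -pair[1]):
    print(f"{point}: priority weight {w:.3f}")
```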

Findings

The study found that the use of prioritization criteria and subcriteria, in conjunction with lateral transshipment and road vulnerability, resulted in a more equitable distribution of relief items by reducing the total average inflated distance traveled per relief item.

Research limitations/implications

To the best of the authors’ knowledge, this study is among the first to address equity in humanitarian response through the prioritization of demand points. It also bridges the gap between two areas that are typically treated separately: multi-criteria decision-making and humanitarian logistics.

Practical implications

This is the first scholarly work on Shiraz focused on an equitable distribution system that prioritizes demand points and assigns relief items to them after the occurrence of a medium-scale earthquake scenario, considering lateral transshipment in the upper echelon.

Originality/value

The paper clarifies how to prioritize demand points to promote equity in humanitarian logistics when multiple factors (i.e. location of relief distribution centers, inventory level, distance, lateral transshipment and road vulnerability) must be considered simultaneously.

Details

Journal of Humanitarian Logistics and Supply Chain Management, vol. 13 no. 4
Type: Research Article
ISSN: 2042-6747

Keywords

Open Access
Article
Publication date: 2 December 2016

Juan Aparicio



Abstract

Purpose

The purpose of this paper is to provide an outline of the major contributions in the literature on the determination of the least distance in data envelopment analysis (DEA). The focus herein is primarily on methodological developments. Specifically, attention is mainly paid to modeling aspects, computational features, the satisfaction of properties and duality. Finally, some promising avenues of future research on this topic are stated.

Design/methodology/approach

DEA is a methodology based on mathematical programming for the assessment of relative efficiency of a set of decision-making units (DMUs) that use several inputs to produce several outputs. DEA is classified in the literature as a non-parametric method because it does not assume a particular functional form for the underlying production function and presents, in this sense, some outstanding properties: the efficiency of firms may be evaluated independently of the market prices of the inputs used and outputs produced; it may be easily used with multiple inputs and outputs; a single score of efficiency for each assessed organization is obtained; this technique ranks organizations based on relative efficiency; and finally, it yields benchmarking information. DEA models provide both benchmarking information and efficiency scores for each of the evaluated units when applied to a dataset of observations and variables (inputs and outputs). Without a doubt, this benchmarking information gives DEA a distinct advantage over other efficiency methodologies, such as stochastic frontier analysis (SFA).

Technical inefficiency is typically measured in DEA as the distance between the observed unit and a “benchmarking” target on the estimated piece-wise linear efficient frontier. The choice of this target is critical for assessing the potential performance of each DMU in the sample, as well as for providing information on how to increase its performance. However, traditional DEA models yield targets that are determined by the “furthest” efficient projection to the evaluated DMU. The projected point on the efficient frontier obtained in this way may not be a representative projection for the judged unit, and consequently, some authors in the literature have suggested determining closest targets instead. The general argument behind this idea is that closer targets suggest directions of enhancement for the inputs and outputs of the inefficient units that may lead them to efficiency with less effort. Indeed, authors like Aparicio et al. (2007) have shown, in an application on airlines, that it is possible to find substantial differences between the targets provided by applying the criterion used by the traditional DEA models and those obtained when the criterion of closeness is utilized for determining projection points on the efficient frontier.

The determination of closest targets is connected to the calculation of the least distance from the evaluated unit to the efficient frontier of the reference technology. In fact, the former is usually computed by solving mathematical programming models associated with minimizing some type of distance (e.g. Euclidean). In this particular respect, the main contribution in the literature is the paper by Briec (1998) on Hölder distance functions, where technical inefficiency with respect to the “weakly” efficient frontier is formally defined through mathematical distances.
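
For concreteness, the sketch below solves the standard radial, input-oriented CCR envelopment model (the “traditional”, furthest-type model that the survey contrasts with least-distance approaches) as a linear program with SciPy; the three DMUs and their input/output data are hypothetical.

```python
# A minimal sketch with hypothetical data: the standard radial,
# input-oriented CCR envelopment model solved as a linear program with
# SciPy. It illustrates the "traditional" DEA model, not a least-distance one.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0], [3.0, 2.0], [4.0, 5.0]])  # inputs, one row per DMU
Y = np.array([[1.0], [1.0], [1.0]])                  # outputs, one row per DMU
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

def ccr_efficiency(o):
    """min theta  s.t.  X' lam <= theta * x_o,  Y' lam >= y_o,  lam >= 0."""
    c = np.r_[1.0, np.zeros(n)]                    # variables: [theta, lam_1..lam_n]
    A_in = np.hstack([-X[o].reshape(-1, 1), X.T])  # sum_j lam_j x_ij - theta x_io <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])    # -sum_j lam_j y_rj <= -y_ro
    A_ub, b_ub = np.vstack([A_in, A_out]), np.r_[np.zeros(m), -Y[o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun

for o in range(n):
    print(f"DMU {o + 1}: CCR efficiency = {ccr_efficiency(o):.3f}")
```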

Findings

All the interesting features of the determination of closest targets from a benchmarking point of view have, in recent times, generated increasing interest among researchers in the calculation of the least distance to evaluate technical inefficiency (Aparicio et al., 2014a). In this paper, we therefore present a general classification of published contributions, mainly from a methodological perspective, and additionally indicate avenues for further research on this topic. The approaches that we cite in this paper differ in the way that the idea of similarity is made operative. Similarity is, in this sense, implemented as the closeness between the values of the inputs and/or outputs of the assessed units and those of the obtained projections on the frontier of the reference production possibility set. Similarity may be measured through multiple distances and efficiency measures; the aim, in any case, is to globally minimize DEA model slacks so as to determine the closest efficient targets. However, as we will show later in the text, minimizing a mathematical distance in DEA is not an easy task, as it is equivalent to minimizing the distance to the complement of a polyhedral set, which is not a convex set. This complexity justifies the existence of different alternatives for solving these types of models.
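
For reference, the least-distance problem underlying these approaches can be stated, in the spirit of Briec (1998), as the Hölder distance from the assessed unit to the weakly efficient frontier of the technology; the block below is a generic sketch of that formulation (the norm and the technology set are as defined in the cited papers):

```latex
D_p(x_0, y_0) \;=\; \min_{(x, y)} \Big\{\, \big\lVert (x_0, y_0) - (x, y) \big\rVert_p \;:\; (x, y) \in \partial^{w}(T) \,\Big\}
```

Because the weakly efficient frontier is part of the boundary of a polyhedral technology set, the feasible region of this minimisation is non-convex, which is precisely the computational difficulty referred to above.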

Originality/value

As far as we are aware, this is the first survey on this topic.

Details

Journal of Centrum Cathedra, vol. 9 no. 2
Type: Research Article
ISSN: 1851-6599

Keywords

Content available
Book part
Publication date: 9 February 2004

Abstract

Details

Economic Complexity
Type: Book
ISBN: 978-0-444-51433-2

Content available
Article
Publication date: 1 March 2006

W.R. Howard


Abstract

Details

Kybernetes, vol. 35 no. 3/4
Type: Research Article
ISSN: 0368-492X

Keywords

Content available
Article
Publication date: 28 January 2020

Christos Papaleonidas, Dimitrios V. Lyridis, Alexios Papakostas and Dimitris Antonis Konstantinidis



Abstract

Purpose

The purpose of this paper is to improve the tactical planning of the stakeholders of the midstream liquefied natural gas (LNG) supply chain, using an optimisation approach. The results can contribute to enhancing proactivity in significant investment decisions.

Design/methodology/approach

A decision support tool (DST) is proposed to minimise the operational cost of a fleet of vessels. Mixed-integer linear programming (MILP), used to perform contract assignment, combined with a genetic algorithm solution, forms the foundation of the DST. These methods formulate the maritime transportation problem from the perspective of tramp shipping companies.
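
As a rough illustration of the second ingredient (not the authors' DST), the sketch below uses a toy genetic algorithm to assign a set of hypothetical delivery contracts to three vessels so as to minimise an invented cost matrix, with a simple penalty term standing in for capacity restrictions.

```python
# A minimal sketch, not the authors' DST: a toy genetic algorithm that
# assigns LNG delivery contracts to vessels so as to minimise a
# hypothetical operational cost, penalising vessels that carry more
# contracts than their assumed annual capacity. All figures are invented.
import random

random.seed(42)

COST = [  # COST[c][v]: cost of serving contract c with vessel v (arbitrary units)
    [12, 15, 11],
    [ 9, 14, 10],
    [20, 13, 16],
    [ 8, 11,  9],
    [17, 10, 14],
]
CAPACITY = [2, 2, 2]          # max contracts per vessel per year (assumed)
N_CONTRACTS, N_VESSELS = len(COST), len(CAPACITY)

def cost(assign):             # assign[c] = vessel serving contract c
    total = sum(COST[c][v] for c, v in enumerate(assign))
    for v in range(N_VESSELS):
        overload = max(0, assign.count(v) - CAPACITY[v])
        total += 100 * overload       # penalise infeasible loading
    return total

def evolve(pop_size=40, generations=200, mutation=0.1):
    pop = [[random.randrange(N_VESSELS) for _ in range(N_CONTRACTS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]             # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_CONTRACTS)  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation:
                child[random.randrange(N_CONTRACTS)] = random.randrange(N_VESSELS)
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

best = evolve()
print("assignment (contract -> vessel):", best, "cost:", cost(best))
```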

Findings

The validation of the DST through a realistic case study illustrates its potential in generating quantitative data about the cost of the midstream LNG supply chain and the annual operations schedule for a fleet of LNG vessels.

Research limitations/implications

The LNG transportation scenarios include assumptions that were required for resource reasons, such as the omission of stochasticity. Notwithstanding the assumptions made, it is the authors’ belief that the paper meets its objectives as described above.

Practical implications

Potential practitioners may exploit the results to make informed decisions on the operation of LNG vessels, charter rate quotes and/or the redeployment of an existing fleet.

Originality/value

The research takes a novel approach, as it combines the creation of a practical management tool with comprehensive mathematical modelling for the midstream LNG supply chain. Quantifying future fleet costs is an alternative approach that may improve the planning procedure of a tramp shipping company.

Details

Maritime Business Review, vol. 5 no. 1
Type: Research Article
ISSN: 2397-3757

Keywords

Content available

Abstract

Details

Kybernetes, vol. 37 no. 1
Type: Research Article
ISSN: 0368-492X

Content available
Book part
Publication date: 8 August 2022

Abstract

Details

Applications of Management Science
Type: Book
ISBN: 978-1-80071-552-3

Open Access
Article
Publication date: 11 March 2022

Andrei Khrennikov


Abstract

Purpose

This paper aims to present the basic assumptions for the creation of a social Fröhlich condensate and to attract the attention of other researchers (both from physics and socio-political science) to the problem of modeling stability and order preservation in a highly energetic society coupled to a high-temperature social energy bath.

Design/methodology/approach

The model of social Fröhlich condensation and its analysis are based on the mathematical formalism of quantum thermodynamics and field theory (applied outside of physics).
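
Purely as an illustration of the class of formalism invoked (open quantum systems), the sketch below integrates a textbook GKSL (Lindblad) master equation for a single two-level system relaxing in a thermal bath; it is not the author's social model, and the coupling rate, bath occupation number and time step are invented.

```python
# Illustrative only: Euler integration of a GKSL (Lindblad) master equation
# for a two-level system relaxing in a thermal bath. This shows the generic
# open-quantum-systems machinery the paper builds on, not the author's
# social Froehlich-condensate model; all parameters are hypothetical.
import numpy as np

sm = np.array([[0, 0], [1, 0]], dtype=complex)          # lowering operator
sp = sm.conj().T                                        # raising operator
H = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)    # energy splitting (hbar = 1)

gamma, nbar = 0.2, 0.5     # coupling rate and mean bath occupation ("temperature")
L_ops = [np.sqrt(gamma * (nbar + 1)) * sm,              # emission into the bath
         np.sqrt(gamma * nbar) * sp]                    # absorption from the bath

def lindblad_rhs(rho):
    """d(rho)/dt = -i[H, rho] + sum_k (L rho L^+ - 1/2 {L^+ L, rho})."""
    drho = -1j * (H @ rho - rho @ H)
    for L in L_ops:
        LdL = L.conj().T @ L
        drho += L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL)
    return drho

rho = np.array([[1, 0], [0, 0]], dtype=complex)  # start in the excited state
dt = 0.01
for step in range(2000):
    rho = rho + dt * lindblad_rhs(rho)

print("excited-state population after relaxation:", rho[0, 0].real)
# the steady state approaches nbar / (2 * nbar + 1) = 0.25 for nbar = 0.5
```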

Findings

The presented quantum-like model provides a consistent operational model of such a complex socio-political phenomenon as Fröhlich condensation.

Research limitations/implications

The model of social Fröhlich condensation is heavily based on the theory of open quantum systems; its consistent elaboration requires additional effort.

Practical implications

Evidence for such a phenomenon as social Fröhlich condensation is provided by the stability of modern informationally open societies.

Social implications

Approaching the state of Fröhlich condensation is a powerful source of social stability. Understanding its informational structure and origin may help to stabilize modern society.

Originality/value

The application of a quantum-like model of Fröhlich condensation in the social and political sciences is a novel and original approach to the mathematical modeling of social stability in a society exposed to powerful informational radiation from mass media and internet-based sources.
