Search results

1 – 10 of over 11,000
Open Access
Article
Publication date: 30 June 2010

Hwa-Joong Kim, Eun-Kyung Yu, Kwang-Tae Kim and Tae-Seung Kim

Dynamic lot sizing is the problem of determining the quantity and timing of ordering items while satisfying the demand over a finite planning horizon. This paper considers the…

Abstract

Dynamic lot sizing is the problem of determining the quantity and timing of ordering items while satisfying the demand over a finite planning horizon. This paper considers the problem with two practical considerations: minimum order size and lost sales. The minimum order size is the smallest quantity that may be purchased in any order, and lost sales arise when items are not on hand, or when it is more economical to forgo a sale than to make it. The objective is to minimize the costs of ordering, item purchase, inventory holding and lost sales over the planning horizon. To solve the problem, we suggest a heuristic algorithm that considers the trade-offs between these cost factors. Computational experiments on randomly generated test instances show that the algorithm quickly obtains near-optimal solutions.
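The paper's own heuristic is not reproduced in the abstract; as a rough illustration of the trade-offs involved, the following sketch solves a tiny instance of the underlying problem exactly by dynamic programming. All costs, demands and the discrete order-quantity grid are hypothetical, chosen only to make the minimum-order-size/lost-sales trade-off visible.

```python
from functools import lru_cache

def lot_size(demand, K, c, h, p, q_min, q_max):
    """Minimum total cost for lot sizing with a minimum order size and
    lost sales. K: fixed ordering cost, c: unit item cost, h: unit
    holding cost, p: unit lost-sale penalty, q_min/q_max: order bounds."""
    @lru_cache(maxsize=None)
    def best(t, inv):
        if t == len(demand):
            return 0.0
        options = []
        for q in [0] + list(range(q_min, q_max + 1)):  # order nothing, or at least q_min
            stock = inv + q
            sold = min(stock, demand[t])
            lost = demand[t] - sold          # unmet demand is lost, not backlogged
            left = stock - sold
            cost = (K + c * q if q > 0 else 0.0) + h * left + p * lost
            options.append(cost + best(t + 1, left))
        return min(options)
    return best(0, 0)

# With a cheap lost-sale penalty it is optimal to lose all demand rather
# than trigger the fixed cost of a minimum-size order; a costly penalty
# makes ordering worthwhile.
cheap = lot_size(demand=(3, 4, 2), K=10, c=1, h=0.5, p=0.1, q_min=5, q_max=12)
costly = lot_size(demand=(3, 4, 2), K=10, c=1, h=0.5, p=4.0, q_min=5, q_max=12)
```

The paper's heuristic approximates such trade-offs without the exhaustive enumeration used in this sketch.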

Details

Journal of International Logistics and Trade, vol. 8 no. 1
Type: Research Article
ISSN: 1738-2122

Keywords

Open Access
Article
Publication date: 19 September 2023

Cleyton Farias and Marcelo Silva

The authors explore the hypothesis that some movements in commodity prices are anticipated (news shocks) and can trigger aggregate fluctuations in small open emerging economies…

Abstract

Purpose

The authors explore the hypothesis that some movements in commodity prices are anticipated (news shocks) and can trigger aggregate fluctuations in small open emerging economies. This paper aims to discuss the aforementioned objective.

Design/methodology/approach

The authors build a multi-sector dynamic stochastic general equilibrium model with endogenous commodity production. There are five exogenous processes: a country-specific interest rate shock that responds to commodity price fluctuations, a productivity (TFP) shock for each sector and a commodity price shock. Both TFP and commodity price shocks are composed of unanticipated and anticipated components.
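The split of a shock into unanticipated and anticipated (news) components can be sketched in a few lines. This is a generic illustration, not the authors' calibrated DSGE model: the AR(1) form, the anticipation horizon and all parameters are assumptions for exposition.

```python
import random

def simulate_log_price(T, rho=0.9, horizon=4, sd_u=1.0, sd_news=1.0, seed=1):
    """AR(1) log commodity price whose innovation is the sum of an
    unanticipated shock u_t and a news shock announced `horizon`
    periods in advance: e_t = u_t + news_{t - horizon}."""
    rng = random.Random(seed)
    news = [rng.gauss(0.0, sd_news) for _ in range(T)]  # known to agents when drawn
    price = [0.0] * T
    for t in range(1, T):
        u = rng.gauss(0.0, sd_u)                        # surprise component
        anticipated = news[t - horizon] if t >= horizon else 0.0
        price[t] = rho * price[t - 1] + u + anticipated
    return price
```

Agents observing `news` at the draw date can adjust investment and consumption before the shock reaches the price at `t`, which is the propagation channel the paper studies.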

Findings

The authors show that news shocks to commodity prices lead to higher output, investment and consumption, and a countercyclical movement in the trade-balance-to-output ratio. The authors also show that commodity price news shocks explain about 24% of output aggregate fluctuations in the small open economy.

Practical implications

Given the importance of both anticipated and unanticipated commodity price shocks, policymakers should pay attention to developments in commodity markets when designing policies to attenuate business cycles. Future research should investigate the design of optimal fiscal and monetary policies in small open economies subject to news shocks in commodity prices.

Originality/value

This paper contributes to the knowledge of the sources of fluctuations in emerging economies, highlighting the importance of a new source: news shocks in commodity prices.

Details

EconomiA, vol. 24 no. 2
Type: Research Article
ISSN: 1517-7580

Keywords

Open Access
Article
Publication date: 22 November 2022

Kedong Yin, Yun Cao, Shiwei Zhou and Xinman Lv

The purposes of this research are to study the theory and method of multi-attribute index system design and establish a set of systematic, standardized, scientific index systems…

Abstract

Purpose

The purposes of this research are to study the theory and method of multi-attribute index system design and establish a set of systematic, standardized, scientific index systems for the design optimization and inspection process. The research may form the basis for a rational, comprehensive evaluation and provide the most effective way of improving the quality of management decision-making. It is of practical significance to improve the rationality and reliability of the index system and provide standardized, scientific reference standards and theoretical guidance for the design and construction of the index system.

Design/methodology/approach

Using modern methods such as complex networks and machine learning, a system for the quality diagnosis of index data and the classification and stratification of index systems is designed. This guarantees the quality of the index data, realizes the scientific classification and stratification of the index system, reduces the subjectivity and randomness of the design of the index system, enhances its objectivity and rationality and lays a solid foundation for the optimal design of the index system.

Findings

Based on the ideas of statistics, system theory, machine learning and data mining, the present research focuses on “data quality diagnosis” and “index classification and stratification”, clarifying the classification standards and data quality characteristics of index data; a data-quality diagnosis system of “data review – data cleaning – data conversion – data inspection” is established. Using decision trees, interpretive structural models, cluster analysis, K-means clustering and other methods, a classification and stratification method system for the indicators is designed to reduce the redundancy of indicator data and improve the quality of the data used. Finally, the scientific and standardized classification and hierarchical design of the index system can be realized.
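As a toy illustration of the unsupervised classification step, here is a plain K-means pass over two-dimensional indicator features. The data, the deterministic seeding from the first k points and the iteration count are all assumptions for the sketch, not the authors' procedure.

```python
def kmeans(points, k, iters=20):
    """Plain K-means on 2-D points; the first k points seed the centroids
    (deterministic, for reproducibility). Returns (centroids, labels)."""
    cents = [points[i] for i in range(k)]
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centroid by squared Euclidean distance.
        labels = [min(range(k),
                      key=lambda j: (p[0] - cents[j][0]) ** 2 +
                                    (p[1] - cents[j][1]) ** 2)
                  for p in points]
        # Update step: move each centroid to the mean of its members.
        for j in range(k):
            members = [p for p, lab in zip(points, labels) if lab == j]
            if members:
                cents[j] = (sum(m[0] for m in members) / len(members),
                            sum(m[1] for m in members) / len(members))
    return cents, labels

# Two well-separated groups of hypothetical indicator features.
pts = [(0.1, 0.2), (0.2, 0.1), (0.0, 0.0), (5.0, 5.1), (5.2, 4.9), (4.8, 5.0)]
cents, labels = kmeans(pts, k=2)
```

Grouping indicators this way is one concrete route to the "scientific classification and stratification" the abstract describes; the full framework adds data-quality diagnosis before any clustering runs.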

Originality/value

The innovative contributions and research value of the paper are reflected in three aspects. First, a method system for index data quality diagnosis is designed, and multi-source data fusion technology is adopted to ensure the quality of multi-source, heterogeneous and mixed-frequency data of the index system. The second is to design a systematic quality-inspection process for missing data based on the systematic thinking of the whole and the individual. Aiming at the accuracy, reliability, and feasibility of the patched data, a quality-inspection method of patched data based on inversion thought and a unified representation method of data fusion based on a tensor model are proposed. The third is to use the modern method of unsupervised learning to classify and stratify the index system, which reduces the subjectivity and randomness of the design of the index system and enhances its objectivity and rationality.

Details

Marine Economics and Management, vol. 5 no. 2
Type: Research Article
ISSN: 2516-158X

Keywords

Open Access
Article
Publication date: 31 December 2003

Young Il Park and Seung Moon

The 1997-98 financial crisis has had a profound effect on how East Asian economies view the role of the IMF and its strategic interests relative to those of the United States in the…

Abstract

The 1997-98 financial crisis has had a profound effect on how East Asian economies view the role of the IMF and its strategic interests relative to those of the United States in the international financial regime. It has prompted them to create a regional mechanism for financial and monetary cooperation, ranging from deeper policy dialogue and surveillance, to a system of financial cooperation, and common exchange rate arrangements. This paper analyses the economic and strategic motivations behind this and outlines recent developments in financial cooperation in East Asia to provide possible directions for the future.

A network of bilateral swap arrangements under the Chiang Mai Initiative (CMI) needs stronger policy dialogue and surveillance to develop into a regional financing facility, a sort of East Asian IMF. Such a facility would play the role of a regional lender of last resort, providing short-term funds to a member country facing a temporary liquidity shortage and supporting market intervention to stabilize foreign exchange rates. East Asian countries need to achieve regional exchange rate stability. In the long run, the region may develop a common currency arrangement, but this cannot be expected in the very near future because there is no convergence of macroeconomic conditions, economic structures and systems. A realistic approach would be for East Asian developing countries to adopt a currency basket system to minimize the impact of dollar/yen exchange rate volatility on their economies. Strong political will and a vision for regional integration will be required to introduce it.

Details

Journal of International Logistics and Trade, vol. 1 no. 1
Type: Research Article
ISSN: 1738-2122

Open Access
Article
Publication date: 1 October 2018

Xunjia Zheng, Bin Huang, Daiheng Ni and Qing Xu

The purpose of this paper is to accurately and promptly capture the risks caused by each road user.


Abstract

Purpose

The purpose of this paper is to accurately and promptly capture the risks caused by each road user.

Design/methodology/approach

The authors proposed a novel risk assessment approach based on the multi-sensor fusion algorithm in the real traffic environment. Firstly, they proposed a novel detection-level fusion approach for multi-object perception in dense traffic environment based on evidence theory. This approach integrated four states of track life into a generic fusion framework to improve the performance of multi-object perception. The information of object type, position and velocity was accurately obtained. Then, they conducted several experiments in real dense traffic environment on highways and urban roads, which enabled them to propose a novel road traffic risk modeling approach based on the dynamic analysis of vehicles in a variety of driving scenarios. By analyzing the generation process of traffic risks between vehicles and the road environment, the equivalent forces of vehicle–vehicle and vehicle–road were presented and theoretically calculated. The prediction steering angle and trajectory were considered in the determination of traffic risk influence area.
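The evidence-theory machinery behind detection-level fusion can be illustrated with Dempster's rule of combination. The hypothesis set, the mass assignments for the two sources and the radar/camera framing are invented for this sketch; the paper's actual framework, with its four track-life states, is considerably richer.

```python
def dempster_combine(m1, m2):
    """Combine two basic mass assignments (dicts mapping frozensets of
    hypotheses to mass) with Dempster's rule of combination."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb          # mass assigned to disjoint hypotheses
    if conflict >= 1.0:
        raise ValueError("total conflict; sources cannot be combined")
    # Normalize by the non-conflicting mass.
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

CAR, PED = frozenset({"car"}), frozenset({"pedestrian"})
EITHER = CAR | PED                            # ignorance: could be either
radar = {CAR: 0.6, EITHER: 0.4}               # hypothetical radar evidence
camera = {CAR: 0.7, PED: 0.2, EITHER: 0.1}    # hypothetical camera evidence
fused = dempster_combine(radar, camera)
```

Fusing the two sources concentrates belief on the hypothesis they agree on, which is how object-type evidence from different sensors is reconciled before tracks and detections are associated.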

Findings

The results of multi-object perception in the experiments showed that the proposed fusion approach achieved low rates of false and missed tracks, and the road traffic risk was described as a field of equivalent force. The results extend the understanding of traffic risk, supporting the claim that risk from the front and back of the vehicle can be perceived in advance.

Originality/value

This approach integrated four states of track life into a generic fusion framework to improve the performance of multi-object perception. The information of object type, position and velocity was used to reduce erroneous data association between tracks and detections. Then, the authors conducted several experiments in real dense traffic environment on highways and urban roads, which enabled them to propose a novel road traffic risk modeling approach based on the dynamic analysis of vehicles in a variety of driving scenarios. By analyzing the generation process of traffic risks between vehicles and the road environment, the equivalent forces of vehicle–vehicle and vehicle–road were presented and theoretically calculated.

Details

Journal of Intelligent and Connected Vehicles, vol. 1 no. 2
Type: Research Article
ISSN: 2399-9802

Keywords

Open Access
Article
Publication date: 8 December 2020

Aleksandar Vasilev

The author augments an otherwise standard business-cycle model with a rich government sector and adds monopolistic competition in the product market and rigid prices, as well as…


Abstract

Purpose

The author augments an otherwise standard business-cycle model with a rich government sector and adds monopolistic competition in the product market and rigid prices, as well as rigid wages a la Calvo (1983) in the labor market.

Design/methodology/approach

This specification with nominal wage rigidity, when calibrated to Bulgarian data after the introduction of the currency board (1999–2018), allows the framework to better reproduce the observed variability and correlations among model variables, in particular those characterizing the labor market.

Findings

As nominal wage frictions are incorporated, the variables become more persistent, especially output, capital stock, investment and consumption, which help the model match data better, as compared to a setup without rigidities.

Practical implications

The findings suggest that technology shocks seem to be the dominant source of economic fluctuations, but nominal wage rigidities, as well as monopolistic competition in the product market, may be important for labor market dynamics in Bulgaria, and such imperfections should be incorporated in any model that studies cyclical movements in employment and wages.

Originality/value

The computational experiments performed in this paper suggest that wage rigidities are a quantitatively important model ingredient, which should be taken into consideration when analyzing the effects of different policies in Bulgaria, which is a novel result.

Details

Journal of Economics and Development, vol. 24 no. 1
Type: Research Article
ISSN: 1859-0020

Keywords

Open Access
Article
Publication date: 2 December 2016

Juan Aparicio

The purpose of this paper is to provide an outline of the major contributions in the literature on the determination of the least distance in data envelopment analysis (DEA). The…


Abstract

Purpose

The purpose of this paper is to provide an outline of the major contributions in the literature on the determination of the least distance in data envelopment analysis (DEA). The focus herein is primarily on methodological developments. Specifically, attention is mainly paid to modeling aspects, computational features, the satisfaction of properties and duality. Finally, some promising avenues of future research on this topic are stated.

Design/methodology/approach

DEA is a methodology based on mathematical programming for the assessment of relative efficiency of a set of decision-making units (DMUs) that use several inputs to produce several outputs. DEA is classified in the literature as a non-parametric method because it does not assume a particular functional form for the underlying production function and presents, in this sense, some outstanding properties: the efficiency of firms may be evaluated independently of the market prices of the inputs used and outputs produced; it may be easily used with multiple inputs and outputs; a single score of efficiency for each assessed organization is obtained; this technique ranks organizations based on relative efficiency; and finally, it yields benchmarking information. DEA models provide both benchmarking information and efficiency scores for each of the evaluated units when applied to a dataset of observations and variables (inputs and outputs). Without a doubt, this benchmarking information gives DEA a distinct advantage over other efficiency methodologies, such as stochastic frontier analysis (SFA). Technical inefficiency is typically measured in DEA as the distance between the observed unit and a “benchmarking” target on the estimated piecewise linear efficient frontier. The choice of this target is critical for assessing the potential performance of each DMU in the sample, as well as for providing information on how to increase its performance. However, traditional DEA models yield targets that are determined by the “furthest” efficient projection to the evaluated DMU. The projected point on the efficient frontier obtained as such may not be a representative projection for the judged unit, and consequently, some authors in the literature have suggested determining closest targets instead.
The general argument behind this idea is that closer targets suggest directions of enhancement for the inputs and outputs of the inefficient units that may lead them to efficiency with less effort. Indeed, authors like Aparicio et al. (2007) have shown, in an application on airlines, that it is possible to find substantial differences between the targets provided by applying the criterion used by the traditional DEA models, and those obtained when the criterion of closeness is utilized for determining projection points on the efficient frontier. The determination of closest targets is connected to the calculation of the least distance from the evaluated unit to the efficient frontier of the reference technology. In fact, the former is usually computed through solving mathematical programming models associated with minimizing some type of distance (e.g. Euclidean). In this particular respect, the main contribution in the literature is the paper by Briec (1998) on Hölder distance functions, where technical inefficiency relative to the “weakly” efficient frontier is formally defined through mathematical distances.
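The contrast between a "furthest" radial target and a closest target can be seen on a toy one-input/one-output frontier. The frontier vertices and the inefficient DMU below are hypothetical, the closest target is restricted to frontier vertices for simplicity (the true least-distance projection may lie inside a facet), and no DEA program is actually solved here.

```python
frontier = [(2.0, 4.0), (4.0, 7.0), (8.0, 9.0)]  # efficient (input, output) units
dmu = (4.0, 4.0)                                  # inefficient unit to project

def dist(p, q):
    """Euclidean distance between two (input, output) points."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def frontier_output(x):
    """Piecewise-linear frontier output at input level x."""
    for (x1, y1), (x2, y2) in zip(frontier, frontier[1:]):
        if x1 <= x <= x2:
            return y1 + (y2 - y1) * (x - x1) / (x2 - x1)
    raise ValueError("x outside the frontier's input range")

# Output-oriented "radial" target: keep the input, expand output to the frontier.
radial = (dmu[0], frontier_output(dmu[0]))
# Closest target: the efficient vertex at least Euclidean distance.
closest = min(frontier, key=lambda v: dist(dmu, v))
```

Here the closest target asks for less total adjustment than the radial one, which is exactly the "less effort" argument for closest-target models.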

Findings

All the interesting features of the determination of closest targets from a benchmarking point of view have generated, in recent times, the increasing interest of researchers in the calculation of the least distance to evaluate technical inefficiency (Aparicio et al., 2014a). So, in this paper, we present a general classification of published contributions, mainly from a methodological perspective, and additionally, we indicate avenues for further research on this topic. The approaches that we cite in this paper differ in the way that the idea of similarity is made operative. Similarity is, in this sense, implemented as the closeness between the values of the inputs and/or outputs of the assessed units and those of the obtained projections on the frontier of the reference production possibility set. Similarity may be measured through multiple distances and efficiency measures. In turn, the aim is to globally minimize DEA model slacks to determine the closest efficient targets. However, as we will show later in the text, minimizing a mathematical distance in DEA is not an easy task, as it is equivalent to minimizing the distance to the complement of a polyhedral set, which is not a convex set. This complexity will justify the existence of different alternatives for solving these types of models.

Originality/value

As far as we are aware, this is the first survey on this topic.

Details

Journal of Centrum Cathedra, vol. 9 no. 2
Type: Research Article
ISSN: 1851-6599

Keywords

Open Access
Article
Publication date: 28 July 2021

Amir Rahimzadeh Dehaghani, Muhammad Nawaz, Rohullah Sultanie and Tawiah Kwatekwei Quartey-Papafio

This research studies a location-allocation problem considering the M/M/m/K queue model in the blood supply chain network. This supply chain includes three levels of suppliers or…


Abstract

Purpose

This research studies a location-allocation problem considering the M/M/m/K queue model in the blood supply chain network. This supply chain includes three levels: suppliers or donors, main blood centers (laboratories for separation, storage and distribution centers) and demand centers (hospitals and private clinics). Moreover, the proposed model is a multi-objective model that minimizes the total cost of the blood supply chain (the cost of unmet demand and inventory spoilage, and the cost of transport between collection centers and the main blood centers), the waiting time of donors in mobile blood donation centers, and the number of mobile centers established in potential places.
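For reference, the donor-waiting-time objective can be grounded in standard finite-capacity queueing formulas. The sketch below computes the stationary distribution and mean queue wait of a single M/M/m/K station from first principles; the arrival and service rates are hypothetical, and the paper embeds such stations inside a larger location-allocation program rather than evaluating them in isolation.

```python
from math import factorial

def mm_m_k(lam, mu, m, K):
    """Stationary probabilities and mean queue wait Wq for an M/M/m/K
    queue: Poisson arrivals (rate lam), m exponential servers (rate mu
    each), at most K customers in the system; excess arrivals are lost."""
    a = lam / mu
    raw = [a ** n / factorial(n) if n <= m
           else a ** n / (factorial(m) * m ** (n - m))
           for n in range(K + 1)]
    total = sum(raw)
    p = [x / total for x in raw]                           # normalize
    Lq = sum((n - m) * p[n] for n in range(m + 1, K + 1))  # mean queue length
    lam_eff = lam * (1 - p[K])                             # arrivals that enter
    return p, Lq / lam_eff                                 # Little's law: Wq = Lq / lam_eff

# One mobile donation site with 2 chairs and room for 4 donors in total.
p, Wq = mm_m_k(lam=3.0, mu=2.0, m=2, K=4)
```

Within the optimization model, terms like `Wq` enter the donor-waiting-time objective while the capacity `K` captures donors turned away at full sites.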

Design/methodology/approach

Since the problem is multi-objective and NP-hard, the metaheuristic NSGA-II is proposed to obtain Pareto solutions, and the parameters of the algorithm are then tuned using the design of experiments. A review of previous research shows little work on queue models in the blood supply chain, and few studies have used these concepts to design blood supply chain networks. Moreover, earlier research has not considered the uncertainty in the number of donors or the importance of blood donors.

Findings

A novel mathematical model guided by the theory of linear programming has been proposed that can help health-care administrators in optimizing the blood supply chain networks.

Originality/value

By building upon solid literature and theory, the current study proposes a novel model for improving the supply chain of blood.

Details

Modern Supply Chain Research and Applications, vol. 3 no. 3
Type: Research Article
ISSN: 2631-3871

Keywords

Open Access
Article
Publication date: 31 December 2003

Hyung Do Ahn and Hong Shik Lee

The real costs of trade, the transport and other costs of doing business internationally, are very important determinants of a country's ability to participate fully in the world…

Abstract

The real costs of trade, the transport and other costs of doing business internationally, are very important determinants of a country's ability to participate fully in the world economy. Remoteness and poor transport and communications infrastructure isolate countries, inhibiting their participation in global production networks. This paper investigates the dependence of transport costs on geography and infrastructure. It shows that infrastructure is quantitatively important in determining transport costs, and improvements in infrastructure can dramatically increase trade flows. It also finds that the low level of Northeast Asian countries' trade flows is largely due to poor infrastructure. Competition among countries in East Asia to maintain or become a logistics hub in the region is severe. This is reflected in the competition to build or expand airports and seaports in the region. Competing countries need to find ways of cooperating to achieve an efficient resource allocation in the region as a whole.

Details

Journal of International Logistics and Trade, vol. 1 no. 1
Type: Research Article
ISSN: 1738-2122

Open Access
Article
Publication date: 2 April 2019

Orlando Gomes and João Frade

This paper aims to provide an overall review and assessment of the virtues and flaws of decentralized self-regulated markets, discussing in particular the extent to which…


Abstract

Purpose

This paper aims to provide an overall review and assessment of the virtues and flaws of decentralized self-regulated markets, discussing in particular the extent to which deceiving attitudes by some market participants might be potentially diluted and contradicted.

Design/methodology/approach

To approach deception and morality in markets, the paper follows two paths. First, the relevant recent literature on the theme is reviewed, examined and debated, and second, one constructs a simulation model equipped with the required elements to discuss the immediate and long-term impacts of deceiving behaviour over market outcomes.

Findings

The discussion and the model allow for highlighting the main drivers of the purchasing decisions of consumers and for evaluating how they react to manipulating behaviour by firms in the market. Agents pursuing short-run gains through unfair market practices are likely to be punished as fooled agents spread the word about the malpractices they were allegedly subject to.
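The punishment mechanism can be caricatured in a few lines: a deceptive firm earns more per sale, but word of mouth erodes the share of consumers willing to buy. The per-sale margins and the trust-decay rate below are invented for this sketch and are not parameters from the paper's model.

```python
def cumulative_profit(rounds, per_sale, trust_decay, consumers=100):
    """Total profit when 'trust' (the share of consumers still buying)
    decays each round as deceived buyers spread the word; an honest firm
    keeps trust_decay = 1.0."""
    trust, total = 1.0, 0.0
    for _ in range(rounds):
        total += consumers * trust * per_sale
        trust *= trust_decay                 # word of mouth shrinks the customer base
    return total

honest = cumulative_profit(rounds=30, per_sale=1.0, trust_decay=1.0)
deceptive = cumulative_profit(rounds=30, per_sale=1.5, trust_decay=0.8)
```

The deceptive firm out-earns the honest one early on but is eventually punished as its customer base evaporates, mirroring the finding above.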

Research limitations/implications

Markets are complex entities, where large numbers of individual agents typically establish local and direct contact with one another. These agents differ in many respects and interact in unpredictable ways. Assembling a concise model capable of addressing such complexity is a difficult task. The framework proposed in this paper points in the intended direction.

Originality/value

The debate in this paper contributes to a stronger perception of the mechanisms that lend markets robustness and vitality.

Details

Journal of Economics, Finance and Administrative Science, vol. 24 no. 48
Type: Research Article
ISSN: 2077-1886

Keywords
