Search results
1 – 10 of 122
Julio Urenda and Olga Kosheleva
Abstract
Purpose
While the main purpose of reporting – e.g. reporting for taxes – is to gauge the economic state of a company, the fact that reporting is done at pre-determined dates distorts the reporting results. For example, to create a larger impression of their productivity, companies fire temporary workers before the reporting date and re-hire them right away. The purpose of this study is to decide how to avoid such distortion.
Design/methodology/approach
This study aims to come up with a solution which is applicable for all possible reasonable optimality criteria. Thus, a general formalism for describing and analyzing all such criteria is used.
Findings
This study shows that most distortion problems will disappear if the fixed pre-determined reporting dates are replaced with individualized random reporting dates. This study also shows that for all reasonable optimality criteria, the optimal way to assign reporting dates is to do it uniformly.
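The proposed scheme can be illustrated with a minimal sketch (the company count, year length and seed are illustrative assumptions, not taken from the study): each company draws its own reporting date uniformly at random over the year, so no single deadline concentrates reporting-driven distortions.

```python
import random

def assign_reporting_dates(n_companies, year_length=365, seed=0):
    """Assign each company an individualized reporting date, drawn
    uniformly at random over the year (the optimality result in the
    study singles out the uniform distribution)."""
    rng = random.Random(seed)
    return [rng.uniform(0, year_length) for _ in range(n_companies)]

dates = assign_reporting_dates(1000)
# With uniform dates, reporting is spread over the whole year
# rather than clustered at a few fixed deadlines.
```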
Research limitations/implications
This study shows that for all reasonable optimality criteria, the optimal way to assign reporting dates is to do it uniformly.
Practical implications
It is found that individualized random tax reporting dates would be beneficial for the economy.
Social implications
It is found that individualized random tax reporting dates would be beneficial for society as a whole.
Originality/value
This study proposes a new idea of replacing the fixed pre-determined reporting dates with randomized ones. On the informal level, this idea may have been proposed earlier, but what is completely new is our analysis of which randomization of reporting dates is best for the economy: it turns out that under all reasonable optimality criteria, uniform randomization works best.
Wu Fuxiang and Cai Yue
Abstract
Purpose
At present, China’s industrial spatial layout faces the predicament of over-agglomeration of Eastern China industries and the near disintegration of industrial structure in the central and western regions. The paper aims to discuss this issue.
Design/methodology/approach
Based on the perspective of differentiated inter-regional labor mobility, this paper constructs a model framework based on a quasi-linear preference utility function with quadratic sub-utility, and conducts model deduction and numerical simulation on the causal factors of this spatial imbalance along the two dimensions of individual and regional welfare.
Findings
The study finds that in the long run, industrial spatial layout imposes a certain threshold limit on the portfolio proportion of differentiated labor. The dilemma of China’s industrial spatial layout is attributable to the deviation of the market’s optimal agglomeration from the socially optimal agglomeration, and to the dysfunction of Eastern China’s role as an intermediary between the global and the domestic value chains.
Originality/value
To resolve this predicament of industrial layout, the unitary welfare compensation based on fiscal transfer payments has to be switched to a more comprehensive approach that gives consideration to industrial rebalancing.
Rubel Amin, Bijay Prasad Kushwaha and Md Helal Miah
Abstract
Purpose
This paper examines the process optimization method of the online sales model of information product demand with respect to the spillover effect (SE). It illustrates the SE of online product demand compared with traditional market demand, and optimizes the SE for ethical and ordinary consumers.
Design/methodology/approach
This article primarily focuses on two types of models for online marketing: one is the wholesale model, the other the agency model. Firstly, the wholesale and agency models without SE and with SE are constructed, respectively, to capture the SE in the different sales models. Secondly, the online channel participants' optimal price, demand and profit under variant conditions are compared and analyzed. Finally, efficient supply chain theory is optimized for the decision-making of online marketing consumers using an equation-based comparative analysis method.
Findings
The study finds that when SEs are not considered, stronger piracy regulation makes online channel participants better off. When the positive SE is strong, it is detrimental to manufacturers. When SEs are not considered, online channel participants reach Pareto optimality only in the agency mode; when SEs are considered, Pareto optimality can be achieved in both the wholesale and agency modes.
Originality/value
The research has practical implications for an effective supply chain model for online marketing. This is the first algorithm-based comparative study concerning theoretical spillover effect analysis in supply chain management.
Senyu Xu, Huajun Tang and Yuxin Huang
Abstract
Purpose
The purpose of this research is to investigate how to introduce a financing scheme to tackle the manufacturer's capital constraint problem, to discuss the effects of data-driven marketing (DDM) quality, the cross-channel-return (CCR) rate and the financing interest rate on the members' pricing and delivery-lead-time decisions and optimal performances, and to analyze how to achieve coordination within a dual-channel supply chain (DSC) by contract coordination.
Design/methodology/approach
This work establishes a DSC model with DDM, in which the offline retailer can provide internal financing to the capital-constrained online manufacturer. Demand is determined by price, DDM quality, customer channel preference and delivery lead time. Then, combined with the Stackelberg game, the optimal pricing and delivery-lead-time decisions are discussed under the inconsistent and consistent pricing strategies with decentralized and centralized systems. Furthermore, a manufacturer-revenue sharing contract is designed to coordinate the members under the two pricing strategies.
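The backward-induction logic of a Stackelberg pricing game can be sketched in a deliberately simplified form (one manufacturer as leader, one retailer as follower, linear demand; all parameter values are illustrative assumptions, and the sketch omits DDM quality, delivery lead time and financing):

```python
def stackelberg_prices(a, b, c):
    """Backward induction for a textbook Stackelberg pricing game.
    Demand: q = a - b*p. The manufacturer (leader) sets the wholesale
    price w; the retailer (follower) then sets the retail price p.

    Follower: max_p (p - w)(a - b p)      =>  p*(w) = (a + b w) / (2 b)
    Leader:   max_w (w - c)(a - b p*(w))  =>  w*    = (a + b c) / (2 b)
    """
    w = (a + b * c) / (2 * b)   # leader's optimal wholesale price
    p = (a + b * w) / (2 * b)   # follower's best-response retail price
    q = a - b * p               # resulting demand
    return w, p, q

w, p, q = stackelberg_prices(a=100, b=2, c=10)
# w = 30.0, p = 40.0, q = 20.0
```

The leader anticipates the follower's best response before committing to its own price, which is the same sequential logic the authors apply to the richer pricing and delivery-lead-time decisions.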
Findings
(1) The increase of DDM quality will reduce the delivery-lead-time under the inconsistent or consistent pricing strategy and will push the selling prices; (2) The growth of the CCR rate will raise selling prices and extend the delivery-lead-time under the decentralized decision; (3) Under price competition, the offline selling price is higher than the online selling price when customers prefer the offline channel and vice versa; (4) The retailer and the manufacturer can achieve a win-win situation through a manufacturer-revenue sharing contract.
Originality/value
This paper contributes to the studies related to DSC by investigating pricing and delivery-lead-time decisions based on DDM, CCR, internal financing and supply chain contract and proposes some managerial implications.
Zabih Ghelichi, Monica Gentili and Pitu Mirchandani
Abstract
Purpose
This paper aims to propose a simulation-based performance evaluation model for the drone-based delivery of aid items to disaster-affected areas. The model performs analytical studies, evaluates the performance of drone delivery systems for humanitarian logistics and can support decision-making on the operational design of the system: where to locate drone take-off points and how to assign and schedule delivery tasks to drones.
Design/methodology/approach
This simulation model captures the dynamics and variabilities of the drone-based delivery system, including demand rates, locations of demand points, time-dependent parameters and possible failures of drones’ operations. An optimization model integrated with the simulation system can update the drones’ schedules and delivery assignments to maintain optimality.
Findings
An extensive set of experiments was performed to evaluate alternative strategies and demonstrate the effectiveness of the proposed optimization/simulation system. In the first set of experiments, the authors use the simulation-based evaluation tool for a case study of Central Florida. The goal of this set of experiments is to show how the proposed system can be used for decision-making and decision support. The second set of experiments presents a series of numerical studies for a set of randomly generated instances.
Originality/value
The goal is to develop a simulation system that allows one to evaluate the performance of drone-based delivery systems, accounting for uncertainties by simulating real-life drone delivery flights. The proposed simulation model captures the variations in different system parameters, including the interval for updating the system after receiving new information; demand parameters (the demand rate and the spatial distribution of demand locations); service-time parameters (travel times, setup and loading times, payload drop-off times and repair times); and drone energy levels (the battery’s energy is depleted in flight and requires battery change/recharging).
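The flavor of such a simulation can be conveyed by a minimal event-driven sketch (the arrival rate, flight-time distribution, drone count and failure model are all illustrative assumptions, far simpler than the authors' system):

```python
import heapq
import random

def simulate_drone_deliveries(n_demands=50, n_drones=3,
                              mean_interarrival=5.0, mean_flight=12.0,
                              failure_prob=0.1, seed=1):
    """Tiny discrete-event simulation of drone deliveries.
    Demands arrive at random times; the next free drone serves each
    demand in arrival order; a flight may fail and must be retried.
    Returns the completion time of the last delivery (makespan) and
    the number of failed flights."""
    rng = random.Random(seed)
    arrivals, t = [], 0.0
    for _ in range(n_demands):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)
    free_at = [0.0] * n_drones        # time at which each drone is free
    heapq.heapify(free_at)
    failures, finish = 0, 0.0
    for arr in arrivals:
        start = max(arr, heapq.heappop(free_at))
        flight = rng.expovariate(1.0 / mean_flight)
        while rng.random() < failure_prob:   # failed flight: retry
            failures += 1
            flight += rng.expovariate(1.0 / mean_flight)
        done = start + flight
        finish = max(finish, done)
        heapq.heappush(free_at, done)
    return finish, failures

makespan, failures = simulate_drone_deliveries()
```

Replaying such a simulation under alternative take-off locations or assignment rules is what lets a decision maker compare operational designs before committing to one.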
Armin Mahmoodi, Milad Jasemi Zergani, Leila Hashemi and Richard Millar
Abstract
Purpose
The purpose of this paper is to maximize the total demand covered by the established additive manufacturing and distribution centers and to maximize the total weight assigned to the drones.
Design/methodology/approach
Disaster management or humanitarian supply chains (HSCs) differ from commercial supply chains in that the aim of HSCs is to minimize the response time to a disaster, as compared to the profit-maximization goal of commercial supply chains. In this paper, the authors develop a relief chain structure that accommodates emerging technologies in humanitarian logistics into the two phases of disaster management: the preparedness stage and the response stage.
Findings
Solving the model with the genetic algorithm and the cuckoo optimization algorithm (COA), and comparing the results with those obtained by the General Algebraic Modeling System (GAMS), makes clear that the genetic algorithm outperforms the other options, as it leads to objective-function values that are 1.6% and 24.1% better than those of GAMS and COA, respectively.
Originality/value
Finally, the presented model has been solved with three methods: one exact method and two metaheuristic methods. The results show that the non-dominated sorting genetic algorithm II (NSGA-II) performs best in finding optimal solutions.
Christian Diego Alcocer, Julián Ortegón and Alejandro Roa
Abstract
Purpose
The relevance of present consumption bias on personal finance has been confirmed in several studies and has important theoretical and practical implications. It has important, measurable implications when analyzing commitment or self-control, adherence to healthy habits (e.g. exercising or dieting), procrastination tendencies or savings. The purpose of this paper is to contribute to our understanding of these issues by postulating a model of income uncertainty within a hyperbolic discounting framework that measures the cost of financial intertemporal inconsistencies related to this bias. The emphasis is on the analysis of this cost. We also propose experimental designs and consistent estimation methods, as well as agent-based modelling extensions.
Design/methodology/approach
The authors develop a finite-horizon model with hyperbolic preferences. Individuals have a present bias distinct from their discount rate so their choices face intertemporal inconsistencies. The authors further extend the analysis with uncertainty about future incomes. Specifically, individuals live for three periods, and the authors find the optimal consumption levels in the perfect-information benchmark by backward induction. They then proceed to add biases and uncertainty to characterize their implications and measure the costs of the intertemporal inconsistencies they cause.
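The cost of intertemporal inconsistency in such a setting can be sketched for the three-period case under quasi-hyperbolic (beta-delta) discounting with log utility (the functional form and all parameter values are illustrative assumptions, not the authors' calibration):

```python
from math import log

def commitment_plan(W, beta, delta):
    """Period-0 optimal plan under quasi-hyperbolic discounting
    U0 = ln(c0) + beta*delta*ln(c1) + beta*delta**2*ln(c2),
    with budget c0 + c1 + c2 = W. With log utility, consumption
    shares are proportional to the discount weights."""
    weights = [1.0, beta * delta, beta * delta ** 2]
    s = sum(weights)
    return [W * w / s for w in weights]

def naive_path(W, beta, delta):
    """The naive agent follows the period-0 plan for c0, then
    re-optimizes remaining wealth in period 1 with a fresh present
    bias: max ln(c1) + beta*delta*ln(c2)."""
    c0, _, _ = commitment_plan(W, beta, delta)
    R = W - c0
    c1 = R / (1.0 + beta * delta)
    return [c0, c1, R - c1]

def U0(plan, beta, delta):
    c0, c1, c2 = plan
    return log(c0) + beta * delta * log(c1) + beta * delta ** 2 * log(c2)

W, beta, delta = 100.0, 0.7, 0.95
commit = commitment_plan(W, beta, delta)
naive = naive_path(W, beta, delta)
# The cost of being free to re-optimize (of not "tying one's hands")
# is the period-0 utility gap between the two paths:
cost = U0(commit, beta, delta) - U0(naive, beta, delta)
```

With no present bias (beta = 1) the two paths coincide and the cost vanishes, matching the finding that the cost depends increasingly on the present-bias measure.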
Findings
The authors measure how much greater an agent's utility is when they “tie their hands” than when they are free to re-evaluate and change their consumption schedule. This “cost of being vulnerable to falling into temptation” depends only (increasingly) on the measure of the present bias and (decreasingly) on the discount factor. They analyze the varying effects on utility and consumption of changes in impatience and optimism. They conclude by discussing theoretical and practical implications; they also propose agent-based simulations, as well as empirical and experimental designs, to further test the relevance and applications of the results.
Practical implications
This model has important, measurable implications when analyzing commitment or self-control, adherence to healthy habits (e.g. exercising or dieting), procrastination tendencies or savings.
Social implications
The results enhance the estimation of the costs of present biases such that employers can better identify the incentives required to acquire and retain human capital. The authors provide evidence that workers are vulnerable to contract renegotiations and about the need for a regulator that restores ex-ante efficiency. Similarly, in the private sector, firms could recognize the postulated consumer profiles and focus their resources on anxious, too-optimistic or potentially addictive consumers; this, again, provides some justification about the need for a regulator.
Originality/value
In traditional exponential discounting, the marginal rate of substitution of consumption between two points depends only on their distance; thus, it allows none of the intertemporal inconsistencies we often observe in real life. Therefore, hyperbolic discounting better fits the data. The authors model choice under uncertainty and focus on the costs caused when present biases (ex-post) push behaviour away from ex-ante optimality. They conclude by proposing experimental designs to further enhance the estimation and implications of these costs. The postulated refinements have the potential to improve previous analyses on commitment devices and commitment-related regulation.
Jiaming Wu and Xiaobo Qu
Abstract
Purpose
This paper aims to review the studies on intersection control with connected and automated vehicles (CAVs).
Design/methodology/approach
The most seminal and recent research in this area is reviewed. This study specifically focuses on two categories: CAV trajectory planning and joint intersection and CAV control.
Findings
It is found that there is a lack of widely recognized benchmarks in this area, which hinders the validation and demonstration of new studies.
Originality/value
In this review, the authors focus on the methodological approaches taken to empower intersection control with CAVs. The authors hope the present review could shed light on the state-of-the-art methods, research gaps and future research directions.
Abstract
Purpose
The purpose of this paper is to provide an outline of the major contributions in the literature on the determination of the least distance in data envelopment analysis (DEA). The focus herein is primarily on methodological developments. Specifically, attention is mainly paid to modeling aspects, computational features, the satisfaction of properties and duality. Finally, some promising avenues of future research on this topic are stated.
Design/methodology/approach
DEA is a methodology based on mathematical programming for the assessment of the relative efficiency of a set of decision-making units (DMUs) that use several inputs to produce several outputs. DEA is classified in the literature as a non-parametric method because it does not assume a particular functional form for the underlying production function and presents, in this sense, some outstanding properties: the efficiency of firms may be evaluated independently of the market prices of the inputs used and outputs produced; it may be easily used with multiple inputs and outputs; a single score of efficiency for each assessed organization is obtained; this technique ranks organizations based on relative efficiency; and finally, it yields benchmarking information. DEA models provide both benchmarking information and efficiency scores for each of the evaluated units when applied to a dataset of observations and variables (inputs and outputs). Without a doubt, this benchmarking information gives DEA a distinct advantage over other efficiency methodologies, such as stochastic frontier analysis (SFA). Technical inefficiency is typically measured in DEA as the distance between the observed unit and a “benchmarking” target on the estimated piecewise linear efficient frontier. The choice of this target is critical for assessing the potential performance of each DMU in the sample, as well as for providing information on how to increase its performance. However, traditional DEA models yield targets that are determined by the “furthest” efficient projection to the evaluated DMU. The projected point on the efficient frontier obtained as such may not be a representative projection for the judged unit, and consequently, some authors in the literature have suggested determining closest targets instead.
The general argument behind this idea is that closer targets suggest directions of enhancement for the inputs and outputs of the inefficient units that may lead them to efficiency with less effort. Indeed, authors like Aparicio et al. (2007) have shown, in an application on airlines, that it is possible to find substantial differences between the targets provided by applying the criterion used by the traditional DEA models and those obtained when the criterion of closeness is utilized for determining projection points on the efficient frontier. The determination of closest targets is connected to the calculation of the least distance from the evaluated unit to the efficient frontier of the reference technology. In fact, the former is usually computed by solving mathematical programming models associated with minimizing some type of distance (e.g. Euclidean). In this particular respect, the main contribution in the literature is the paper by Briec (1998) on Hölder distance functions, where technical inefficiency with respect to the “weakly” efficient frontier is formally defined through mathematical distances.
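The gap between the traditional radial ("furthest") projection and the closest target can be seen in a deliberately tiny example (one input, one output, constant returns to scale; the data are illustrative assumptions, and with a single input and output the CRS frontier reduces to a ray through the origin):

```python
from math import hypot

# Observed DMUs: (input x, output y). Under CRS with one input and
# one output, the efficient frontier is the ray y = r*x, where r is
# the best observed output/input ratio.
dmus = [(2.0, 2.0), (4.0, 2.0), (3.0, 3.0)]
r = max(y / x for x, y in dmus)          # r = 1.0 here

x0, y0 = 4.0, 2.0                        # the inefficient unit

# Traditional input-oriented (radial) target: contract the input
# until the frontier is reached, holding output fixed.
theta = (y0 / r) / x0                    # radial efficiency score
radial_target = (theta * x0, y0)         # (2.0, 2.0)

# Closest (least-distance) target: the Euclidean projection of
# (x0, y0) onto the frontier ray y = r*x.
t = (x0 + r * y0) / (1.0 + r * r)
closest_target = (t, r * t)              # (3.0, 3.0)

d_radial = hypot(x0 - radial_target[0], y0 - radial_target[1])
d_closest = hypot(x0 - closest_target[0], y0 - closest_target[1])
# The closest target demands strictly less adjustment effort.
```

Even in this toy case the radial projection asks the unit to halve its input, while the least-distance target reaches the frontier with a smaller combined change in input and output, which is precisely the argument for closest targets.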
Findings
All the interesting features of the determination of closest targets from a benchmarking point of view have generated, in recent times, the increasing interest of researchers in the calculation of the least distance to evaluate technical inefficiency (Aparicio et al., 2014a). So, in this paper, we present a general classification of published contributions, mainly from a methodological perspective, and additionally, we indicate avenues for further research on this topic. The approaches that we cite in this paper differ in the way that the idea of similarity is made operative. Similarity is, in this sense, implemented as the closeness between the values of the inputs and/or outputs of the assessed units and those of the obtained projections on the frontier of the reference production possibility set. Similarity may be measured through multiple distances and efficiency measures. In turn, the aim is to globally minimize DEA model slacks to determine the closest efficient targets. However, as we will show later in the text, minimizing a mathematical distance in DEA is not an easy task, as it is equivalent to minimizing the distance to the complement of a polyhedral set, which is not a convex set. This complexity will justify the existence of different alternatives for solving these types of models.
Originality/value
To the best of the authors' knowledge, this is the first survey on this topic.
Abstract
Purpose
The authors introduce non-Ricardian (“hand-to-mouth”) myopic agents into an otherwise standard real-business-cycle (RBC) setup augmented with a detailed government sector. The authors investigate the quantitative importance of the presence of nonoptimizing households for cyclical fluctuations in Bulgaria.
Design/methodology/approach
The authors calibrate the RBC model to Bulgarian data for the period following the introduction of the currency board arrangement (1999–2018).
Findings
The authors find that the inclusion of such non-Ricardian households improves model performance along several dimensions and generally provides a better match vis-à-vis the data, as compared to the standard model populated with Ricardian agents only.
Originality/value
This is a novel finding in the macroeconomic studies on Bulgaria using modern quantitative methods.