Search results

1 – 10 of over 7000
Article
Publication date: 2 August 2013

Kenichi Nakashima and Arvinder P.S. Loomba

Abstract

Purpose

The purpose of this study is to consider the acquisition of end-of-life products of variable quality for remanufacturing, so as to determine an optimal control policy that minimizes per-period expected costs and may guide future practice.

Design/methodology/approach

The authors review recent literature on reverse supply chains and remanufacturing. They utilize an undiscounted Markov decision process methodology to ascertain the order amount of remanufacturable products using optimal control under a minimum-cost criterion.
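As a rough illustration of the undiscounted (average-cost) Markov decision process machinery this methodology relies on, the sketch below runs relative value iteration on a toy ordering problem. All states (quality levels of returned cores), order actions, costs and transition probabilities are invented for illustration; they are not the paper's model.

```python
import numpy as np

# Toy undiscounted MDP: states = quality level of incoming cores (0..2),
# actions = order quantity (0..2). We minimize the long-run average
# per-period cost with relative value iteration.
n_states, n_actions = 3, 3
rng = np.random.default_rng(0)

# P[a, s, s'] : transition probabilities under action a (each row sums to 1)
P = rng.dirichlet(np.ones(n_states), size=(n_actions, n_states))

# c[s, a] : acquisition cost (higher quality costs more) plus a shortage
# penalty of 4.0 when the order falls short of a one-unit demand
acq_cost = np.array([1.0, 2.0, 3.0])
orders = np.arange(n_actions)
c = acq_cost[:, None] * orders[None, :] + 4.0 * np.maximum(1 - orders, 0)[None, :]

def relative_value_iteration(P, c, tol=1e-9, max_iter=10_000):
    """Average-cost optimal policy for an undiscounted (unichain) MDP."""
    h = np.zeros(c.shape[0])                    # relative value function
    for _ in range(max_iter):
        q = c + np.einsum("ast,t->sa", P, h)    # Q(s,a) = c(s,a) + E[h(s')]
        h_new = q.min(axis=1)
        h_new -= h_new[0]                       # pin the reference state
        if np.max(np.abs(h_new - h)) < tol:
            break
        h = h_new
    q = c + np.einsum("ast,t->sa", P, h)
    return q.argmin(axis=1), q.min(axis=1)[0]   # policy, long-run average cost

policy, avg_cost = relative_value_iteration(P, c)
print(policy, avg_cost)
```

The policy maps each observed quality state to an order quantity; the returned gain is the minimized per-period expected cost.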

Findings

The authors conclude that it makes sense for firms to focus on cost management through production control based on quality levels and the differing acquisition costs of remanufacturable products.

Research limitations/implications

Although the Markov decision process methodology – which is well supported in the literature – was diligently followed, the nature of the analysis and discussion may be subject to the authors' bias. Future investigation and adoption of the methodological approach used here can verify the paper's findings.

Practical implications

This study determines an optimal control policy for ordering a specific amount of product that minimizes per-period expected costs for remanufacturing. Reverse supply-chain professionals now have an easy-to-follow guide for acquiring end-of-life remanufacturable product alternatives of variable quality.

Social implications

This study determines the optimal policy for ordering remanufacturable products. This information enables practitioners to reduce their carbon footprint in the reverse supply chain through inspection/sorting before remanufacturing, by processing only the type, quality and quantity of product needed.

Originality/value

For the reverse supply chain to be taken seriously by senior management, it is imperative that practitioners in this field synchronize their operational-level ordering decisions with a holistic cost-minimization objective (to maximize value recovery) to stay viable.

Details

Journal of Advances in Management Research, vol. 10 no. 2
Type: Research Article
ISSN: 0972-7981

Book part
Publication date: 15 July 2020

Charles M. Cameron, John M. de Figueiredo and David E. Lewis

Abstract

We examine personnel policies and careers in public agencies, particularly how wages and promotion standards can partially offset a fundamental contracting problem: the inability of public-sector workers to contract on performance, and the inability of political masters to contract on forbearance from meddling. Despite the dual contracting problem, properly constructed personnel policies can encourage intrinsically motivated public-sector employees to invest in expertise, seek promotion, remain in the public sector, and work hard. To do so requires internal personnel policies that sort “slackers” from “zealots.” Personnel policies that accomplish this task are quite different in agencies where acquired expertise has little value in the private sector, and agencies where acquired expertise commands a premium in the private sector. Even with well-designed personnel policies, an inescapable trade-off between political control and expertise acquisition remains.

Details

Employee Inter- and Intra-Firm Mobility
Type: Book
ISBN: 978-1-78973-550-5

Article
Publication date: 7 August 2017

Yolanda F. Rebollo-Sanz

Abstract

Purpose

The purpose of this paper is to show that, for key topics in labour economics such as the effects of seniority and job mobility on wages, it is important to explicitly consider firm fixed effects. The author also tests whether the importance of firms in explaining wage dispersion is higher or lower in Spain than in other European countries.

Design/methodology/approach

The author estimates an individual wage equation in which firm and worker effects are considered and the estimation process controls for censored wages. This exercise is performed for the Spanish economy over the course of a whole business cycle, i.e. 2000-2015.
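A minimal sketch of the core idea – a wage equation with two high-dimensional fixed effects (worker and firm) – estimated by alternating demeaning ("zig-zag") on simulated data. All parameters are invented, and the paper's treatment of censored wages is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)
n_workers, n_firms, n_obs = 200, 20, 5_000

# Random worker-firm matches with true additive effects
worker = rng.integers(n_workers, size=n_obs)
firm = rng.integers(n_firms, size=n_obs)
alpha = rng.normal(0.0, 1.0, n_workers)      # true worker effects
psi = rng.normal(0.0, 0.5, n_firms)          # true firm effects
log_wage = alpha[worker] + psi[firm] + rng.normal(0.0, 0.1, n_obs)

a_hat = np.zeros(n_workers)
p_hat = np.zeros(n_firms)
for _ in range(200):
    # Given firm effects, each worker effect is the mean residual per worker;
    # then symmetrically for firms
    r = log_wage - p_hat[firm]
    a_hat = np.bincount(worker, r, minlength=n_workers) / np.bincount(worker, minlength=n_workers)
    r = log_wage - a_hat[worker]
    p_hat = np.bincount(firm, r, minlength=n_firms) / np.bincount(firm, minlength=n_firms)

p_hat -= p_hat.mean()   # effects are identified only up to a common constant
print(np.corrcoef(p_hat, psi - psi.mean())[0, 1])
```

The variance of the recovered firm effects is what underlies statements like "firms' wage policies explain X per cent of wage heterogeneity" in studies of this kind.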

Findings

The author demonstrates that Spanish firms explain around 27 per cent of individual wage heterogeneity and, more importantly, around 74 per cent of inter-industry wage differentials. In both cases, this contribution is mainly related to large dispersion in firms' wage policies. The process of positive sorting of workers across firms or industries does not play an important role. Interestingly, the importance of firms' wage policies in explaining individual wage dispersion has increased over the recent Great Recession.

Practical implications

The results confirm that firms set wages and, hence, are partially responsible for individual wage heterogeneity and, more importantly, for inter-industry wage dispersion.

Originality/value

The exercise is performed under optimal conditions: the author uses a longitudinal matched employer-employee data set, observes wages at a monthly frequency, and implements an estimation method suitable for censored models with two high-dimensional fixed effects. This is the first study to look deeply into the role of firms in explaining wage heterogeneity at the individual and industry levels in Spain and across the recent Great Recession.

Article
Publication date: 2 May 2017

Trond Arne Borgersen

Abstract

Purpose

The purpose of this paper is twofold: first, it derives the optimal loan-to-value (LTV) ratio for a mortgagor that maximizes the return to home equity when considering the capital structure of housing investment. Second, it analyses the demand-side contribution to mortgage market variability across monetary policy regimes.

Design/methodology/approach

The paper endogenizes both the relation between the LTV ratio and the mortgage rate and the relation between the LTV ratio and the rate of appreciation. To assess LTV variance and the demand-side contribution to mortgage market variability, three stylized monetary policy regimes are considered.
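The core mechanism – leverage raising the return to home equity until the LTV-dependent mortgage rate erodes the gain – can be illustrated numerically. The housing return and the rate schedule below are assumed for illustration only; the paper's actual functional forms (including the LTV-appreciation relation) are not reproduced.

```python
import numpy as np

r_h = 0.06                     # assumed return on the housing asset

def mortgage_rate(ltv):
    # Assumed schedule: lenders price higher LTV ratios with a convex risk premium
    return 0.02 + 0.08 * ltv**2

def return_to_equity(ltv):
    # Levered return: (asset return - debt service) per unit of equity
    return (r_h - ltv * mortgage_rate(ltv)) / (1.0 - ltv)

ltvs = np.linspace(0.0, 0.95, 500)
returns = return_to_equity(ltvs)
ltv_opt = ltvs[np.argmax(returns)]
print(ltv_opt, returns.max())
```

With these assumed numbers the return to equity is maximized at an interior LTV ratio: leverage helps at first, then the marginal borrowing cost dominates.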

Findings

The paper finds an intuitive ranking of the optimal LTV ratios across regimes, with the optimal LTV ratio peaking during a housing boom. When, however, monetary policy ignores asset inflation, the demand-side contribution to market variability is highest under normal market conditions. Hence, there is a potentially hump-shaped relation between the risk exposure of individual mortgagors and the demand-side contribution to mortgage market variability.

Originality/value

The paper finds a potentially hump-shaped relation between the risk exposure of individual mortgagors and the demand-side contribution to mortgage market variability, which, to the best of the author's knowledge, is novel. The paper shows how macro-prudential and monetary policy are complementary tools for preserving financial stability.

Book part
Publication date: 13 May 2017

Hugo Jales and Zhengfei Yu

Abstract

This chapter reviews recent developments in the density discontinuity approach. It is well known that agents having perfect control of the forcing variable will invalidate the popular regression discontinuity designs (RDDs). To detect manipulation of the forcing variable, McCrary (2008) developed a test based on the discontinuity in the density around the threshold. Recent papers have noted that the sorting patterns around the threshold are often either the researcher's object of interest or related to structural parameters such as tax elasticities through known functions. This, in turn, implies that the behavior of the distribution around the threshold is not only informative of the validity of a standard RDD; it can also be used to recover policy-relevant parameters and perform counterfactual exercises.
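A simplified, illustrative version of a density-discontinuity check in the spirit of McCrary (2008): bin the forcing variable, fit separate linear trends to the bin heights on each side of the cutoff, and compare the two implied densities at the threshold. McCrary's actual test uses local linear density estimation with a specific bandwidth and standard errors, which this sketch omits; all data here are simulated.

```python
import numpy as np

rng = np.random.default_rng(1)
n, threshold = 20_000, 0.0

x = rng.normal(0.0, 1.0, n)
# Simulate manipulation: many agents just below the cutoff sort to just above it
manipulated = (x > -0.1) & (x < 0.0) & (rng.random(n) < 0.7)
x[manipulated] += 0.1

def density_jump(x, threshold, bin_width=0.05, k=10):
    """Log-difference of the estimated density just right vs. just left."""
    edges = threshold + bin_width * np.arange(-k, k + 1)
    counts, _ = np.histogram(x, bins=edges)
    mids = (edges[:-1] + edges[1:]) / 2
    left, right = mids < threshold, mids > threshold
    # Extrapolate the binned counts linearly to the threshold from each side
    dl = np.polyval(np.polyfit(mids[left], counts[left], 1), threshold)
    dr = np.polyval(np.polyfit(mids[right], counts[right], 1), threshold)
    return np.log(dr) - np.log(dl)

theta = density_jump(x, threshold)
print(theta)  # positive => excess mass just above the cutoff
```

A clearly positive jump signals bunching above the threshold; as the chapter notes, the size of such a jump can itself be the object of interest.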

Details

Regression Discontinuity Designs
Type: Book
ISBN: 978-1-78714-390-6

Article
Publication date: 22 February 2013

SiewMun Ha

Abstract

Purpose

Enhanced risk management through the application of mathematical optimization is the next competitive‐advantage frontier for the primary‐insurance industry. The widespread adoption of catastrophe models for risk management provides the opportunity to exploit mathematical optimization techniques to achieve superior financial results over traditional methods of risk allocation. The purpose of this paper is to conduct a numerical experiment to evaluate the relative performances of the steepest‐ascent method and genetic algorithm in the solution of an optimal risk‐allocation problem in primary‐insurance portfolio management.

Design/methodology/approach

The performances of two well-established optimization methods – steepest ascent and a genetic algorithm – are evaluated by applying them to the problem of minimizing the catastrophe risk of a US book of policies while concurrently maintaining a minimum level of return.
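The qualitative contrast between the two methods can be shown on a toy multimodal surrogate objective (a stand-in for the paper's risk-allocation problem, whose formulation is not given here): steepest ascent climbs to whatever optimum its starting point leads to, while even a minimal genetic algorithm can escape a local optimum through population search.

```python
import numpy as np

rng = np.random.default_rng(42)

def f(x):
    # Invented multimodal objective: global maximum near the origin,
    # a lower local maximum near (2, 2)
    x = np.asarray(x, dtype=float)
    return (np.exp(-0.2 * np.sum(x**2, axis=-1))
            + 0.5 * np.exp(-np.sum((x - 2.0)**2, axis=-1)))

def steepest_ascent(x0, lr=0.1, steps=500, eps=1e-6):
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(steps):
        # Numerical gradient via central differences
        g = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                      for e in np.eye(2)])
        x += lr * g
    return x

def genetic_algorithm(pop_size=60, gens=100, sigma=0.3):
    pop = rng.uniform(-3.0, 5.0, size=(pop_size, 2))
    for _ in range(gens):
        fit = f(pop)
        parents = pop[np.argsort(fit)[-pop_size // 2:]]        # truncation selection
        children = parents[rng.integers(len(parents), size=pop_size)]
        pop = children + rng.normal(0.0, sigma, children.shape)  # mutation
    return pop[np.argmax(f(pop))]

x_sa = steepest_ascent(np.array([2.5, 2.5]))   # starts in the local basin
x_ga = genetic_algorithm()
print(f(x_sa), f(x_ga))
```

This mirrors the paper's finding in miniature: the ascent method depends on its starting point, while the genetic algorithm buys robustness with extra function evaluations.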

Findings

The steepest-ascent method was found to be functionally dependent on, but not overly sensitive to, the choice of starting policy. The genetic algorithm produced a superior solution to the steepest-ascent method at the cost of increased computation time.

Originality/value

The results provide practical guidelines for algorithm selection and implementation for the reader interested in constructing an optimal insurance portfolio from a set of available policies.

Details

The Journal of Risk Finance, vol. 14 no. 2
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 12 October 2015

Jorge Lara Alvarez

Abstract

Purpose

The data employed to measure income inequality usually come from household surveys, which commonly suffer from atypical observations such as outliers and contamination points. This is of importance since a single atypical observation can make classical inequality indices totally uninformative. To deal with this problem, robust univariate parametric or ad hoc procedures are commonly used; however, neither is fully satisfactory. The purpose of this paper is to propose a methodology to deal with this problem.
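The sensitivity claim is easy to demonstrate: a single contaminated observation can dominate the classical Gini index. The MAD-based flagging rule below is only a crude univariate stand-in; the paper's procedures are multidirectional and non-parametric, which this sketch does not reproduce.

```python
import numpy as np

rng = np.random.default_rng(7)

def gini(y):
    y = np.sort(np.asarray(y, dtype=float))
    n = len(y)
    # Gini = 2 * sum(i * y_i) / (n * sum(y)) - (n + 1) / n, with i = 1..n
    return 2 * np.sum(np.arange(1, n + 1) * y) / (n * y.sum()) - (n + 1) / n

incomes = rng.lognormal(mean=10, sigma=0.5, size=5_000)   # simulated incomes
g_clean = gini(incomes)

contaminated = np.append(incomes, 1e9)    # one mis-coded income
g_dirty = gini(contaminated)

# Naive robustification: drop points far from the median in MAD units
z = np.abs(contaminated - np.median(contaminated))
mad = np.median(z)
g_robust = gini(contaminated[z < 10 * mad])

print(g_clean, g_dirty, g_robust)
```

The single bad record pushes the Gini from roughly its clean value toward 1, while the trimmed estimate stays close to the uncontaminated one.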

Design/methodology/approach

The author proposes two robust procedures to estimate inequality indices that can use all the information in a data set; neither relies on a parametric distributional assumption. The methodology performs well irrespective of the size and quality of the data set.

Findings

Applying these methods to household data for the UK (1979) and Mexico (2006 and 2011), the author finds that for the UK data the Gini, coefficient of variation and Theil inequality indices are overestimated by between 0.02 and 0.04, while for Mexico the same indices are overestimated more severely, by between 0.1 and almost 0.4. The relevance of including atypical observations that follow the linear pattern of the data is shown using the data from Mexico (2011).

Research limitations/implications

The methodology has two main limitations: the procedures cannot distinguish a bad leverage outlier from a contamination point; and when the data have no atypical observations, the procedures will still tag a very small fraction of observations as atypical.

Social implications

A reduction in the estimate of inequality has important consequences from a policy maker's perspective. First, ceteris paribus, the optimal amount of resources allocated to directly addressing inequality/poverty falls, and the "extra" resources can be devoted to promoting growth. This is a direct consequence of the economy being more egalitarian than previously thought: poor people will actually enjoy a bigger share of any increment in national income. It also implies that, in order to reduce poverty, public policies should focus more on economic growth.

Originality/value

To the author's knowledge, this is the first methodology in the inequality literature able to identify outliers and contamination points in more than one direction – that is, not only at the tails of the distribution but over the whole marginal distribution of income. This is possible through the use of other variables related to income.

Details

International Journal of Social Economics, vol. 42 no. 10
Type: Research Article
ISSN: 0306-8293

Article
Publication date: 5 December 2022

Azam Najafgholinejad

Abstract

Purpose

This study aims to evaluate the usability and information architecture of the digital library (DL) website of the National Library and Archives of Iran (NLAI).

Design/methodology/approach

This applied study used an exploratory mixed method centred on card sorting. Data were collected through interviews, observation, a usability test and card sorting. By interviewing users about problems with the DL, eight tasks were predefined and users' problems along the path were identified. Their satisfaction with the tasks and the usability rates were measured via a questionnaire. Card sorting was done to shed light on the information organization of the website elements. The study population included all users of the DL of the NLAI in two groups: ten initiator users (public users) and ten expert users (librarians). SPSS was used for analysing the quantitative usability-test data, and MAXQDA was applied for analysing the interview-driven qualitative data. Qualitative content analysis, categorization (data organization and grouping) and determination of main and secondary codes were applied as well. The OptimalSort application was used for analysing the card-sorting data in the form of a similarity matrix and a dendrogram. For validating the qualitative findings, triangulation was used. The internal reliability of the questionnaire amounted to α = 0.87.

Findings

Regarding the assigned tasks, new initiator users spent the most time registering (367.67 s), as did new expert users (403 s). Task 2 ranked first in incompleteness among initiator users (40%), and Task 3 ranked first with a 30% incompletion rate. Expert users had more unsuccessful attempts. Task 5 (mean rating 3.35) and Task 8 (mean rating 2.25) were the most difficult and the easiest tasks, respectively. Some usability components were rated lower than the moderate point. Only 30% of initiator users and 10% of expert users were satisfied with the website. A total of 12 categories and 452 codes were identified as the main problems of the DL. Problems related to a vague perception of concepts and labels (90 repetitions) and to digital-source display (six repetitions) ranked as the first and the last problems in working with the DL, respectively. The OptimalSort package produced the card-sorting results as a similarity matrix and a dendrogram. Card sorting suggested some changes in the organization of information items. Interviews after card sorting surfaced some new groups to be included, such as links to other digital libraries, shared databases in the organization and frequently asked questions.

Originality/value

A library's website should be designed so that it can satisfy users with different traits. As information technologies develop, the importance of such design grows for better service provision and effective competition.

Details

The Electronic Library, vol. 41 no. 1
Type: Research Article
ISSN: 0264-0473

Article
Publication date: 5 April 2023

Chunqiu Xu, Fengzhi Liu, Yanjie Zhou, Runliang Dou, Xuehao Feng and Bo Shen

Abstract

Purpose

This paper aims to find optimal emission reduction investment strategies for the manufacturer and examine the effects of carbon cap-and-trade policy and uncertain low-carbon preferences on emission reduction investment strategies.

Design/methodology/approach

This paper studies a supply chain consisting of one manufacturer and one retailer, in which the manufacturer is responsible for emission reduction investment. The manufacturer has two emission reduction investment strategies: (1) invest in traditional emission reduction technologies only in the production process and (2) increase investment in smart supply chain technologies in the use process. Then, three different Stackelberg game models are developed to explore the manufacturer's benefits in different cases. Finally, the paper coordinates the manufacturer and the retailer through a revenue-sharing contract.
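A minimal two-stage Stackelberg sketch of the leader-follower structure such models share (illustrative linear demand and quadratic abatement cost, not the paper's model): the manufacturer leads with a wholesale price w and an emission-reduction level e, the retailer follows with the retail price p, and demand rises with emission reduction via a low-carbon preference parameter k.

```python
import numpy as np

# Assumed parameters: demand intercept/slope, low-carbon preference,
# unit production cost, abatement-cost coefficient
a, b, k, c, lam = 100.0, 1.0, 2.0, 10.0, 5.0

def retailer_price(w, e):
    # Retailer best response: argmax_p (p - w) * (a - b*p + k*e)
    return (a + k * e + b * w) / (2 * b)

def manufacturer_profit(w, e):
    p = retailer_price(w, e)
    q = a - b * p + k * e
    return (w - c) * q - 0.5 * lam * e**2   # quadratic abatement cost

# Backward induction: optimize the leader's decisions by grid search
ws = np.linspace(c, a, 400)
es = np.linspace(0.0, 20.0, 400)
W, E = np.meshgrid(ws, es)
profits = manufacturer_profit(W, E)
i, j = np.unravel_index(np.argmax(profits), profits.shape)
w_opt, e_opt = W[i, j], E[i, j]
print(w_opt, e_opt, profits[i, j])
```

For these parameters the first-order conditions give w* = 66.25 and e* = 11.25, which the grid search recovers; a stronger preference k raises the optimal emission-reduction investment, echoing the paper's comparative statics.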

Findings

The manufacturer's optimal emission reduction strategy is dynamic. When consumers' low-carbon preferences are low and the government implements a carbon cap-and-trade policy, the manufacturer obtains the highest profit by increasing emission reduction investment in the use process. The carbon cap-and-trade policy encourages the manufacturer to reduce emissions only when the initial carbon emission is low. The emission reduction, the order quantity and the manufacturer's profit increase with consumers' low-carbon preferences, and the manufacturer can adjust the emission reduction investment according to the emission reduction cost coefficients of the two processes.

Originality/value

This paper considers the investment of emission reduction technologies in different processes and provides theoretical guidance for manufacturers to make a low-carbon transformation. Furthermore, the paper provides suggestions for governments to effectively implement carbon cap-and-trade policy.

Details

Industrial Management & Data Systems, vol. 123 no. 10
Type: Research Article
ISSN: 0263-5577

Article
Publication date: 12 October 2020

Md. Rakibul Hasan, Abu Hashan Md Mashud, Yosef Daryanto and Hui Ming Wee

Abstract

Purpose

External factors such as improper handling, extreme weather and insect attacks affect product quality. This is most obvious in fruit products, which have a high deterioration rate; moreover, decaying fruits accelerate the deterioration of other, good ones. The purpose of this study is to derive the optimal pricing and replenishment decisions for agricultural products considering the effect of external factors that induce deterioration.

Design/methodology/approach

In this paper, the study investigates ways to reduce the product deterioration rate by separating the near-defective items from the other, good products and accelerating the sale of the near-defective items at a discounted price. The objective is to maximize the total profit by optimizing the selling price and the replenishment cycles. Two scenarios are investigated: in the first, the retailer offers a selling-price discount on near-defective products to stimulate customer demand; in the second, the retailer does not offer such discounts.
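The intuition behind the two scenarios can be sketched in a single-period toy model (hypothetical linear demands and costs; the paper's replenishment-cycle dynamics are omitted): a fraction theta of each lot turns near-defective, and scenario 1 clears those units at a discount price while scenario 2 lets them perish.

```python
import numpy as np

# Assumed parameters: regular-market demand a - b*p, a more price-sensitive
# discount market, unit cost, and the near-defective fraction theta
alpha, beta = 200.0, 2.0
alpha_d, beta_d = 120.0, 4.0
cost, theta = 20.0, 0.25

def profits(p, pd):
    q_good = np.maximum(alpha - beta * p, 0.0)
    lot = q_good / (1.0 - theta)             # order enough to cover good demand
    q_def = np.minimum(theta * lot, np.maximum(alpha_d - beta_d * pd, 0.0))
    with_discount = p * q_good + pd * q_def - cost * lot
    without = p * q_good - cost * lot        # near-defective units perish unsold
    return with_discount, without

# Grid search over the regular price p and the discount price pd
ps = np.linspace(cost, alpha / beta, 300)
pds = np.linspace(0.0, alpha_d / beta_d, 300)
P, PD = np.meshgrid(ps, pds)
pi1, pi2 = profits(P, PD)
print(pi1.max(), pi2.max())
```

Under these assumptions the discount scenario strictly dominates, consistent with the paper's finding that the total profit of the discount model is higher.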

Findings

An algorithm to solve the model is derived. Further, numerical examples are developed to compare the total profit for the two scenarios. Theoretical derivations and graphical results show the concavity of the profit function. Finally, the sensitivity analysis shows that the total profit of the discount model is higher.

Originality/value

This study contributes a new pricing and inventory decision model. The research provides insights to retailers on making optimal pricing and replenishment decisions for non-instantaneously deteriorating items, as well as on mitigating the external factors that drive higher deterioration rates by separating good products from the near-defective ones, which are sold at a discount to induce sales.
