Search results
1 – 10 of over 3000
Abdoul Aziz Ndoye and Michel Lubrano
Abstract
We provide Bayesian inference for a mixture of two Pareto distributions, which is then used to approximate the upper tail of a wage distribution. The model is applied to data from the CPS Outgoing Rotation Group to analyze the recent structure of top wages in the United States from 1992 through 2009. We find enormous earnings inequality between the very highest wage earners (the “superstars”) and the other high wage earners. These findings are largely in accordance with alternative explanations that combine the model of superstars and the model of tournaments in hierarchical organization structures. The approach can be used to analyze recent pay gaps among top executives in large firms and to exhibit the “superstar” effect.
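As a rough illustration of how a two-component Pareto mixture separates a “superstar” tail from the other high earners, the sketch below simulates such a mixture by inverse-CDF sampling and computes the wage share of the top 1 per cent. All parameter values are hypothetical, chosen for illustration only; they are not the paper’s estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters: component 1 covers "ordinary" high earners,
# component 2 the heavier-tailed superstar group.
w = 0.9                  # weight of the first component
alpha1, xm1 = 3.0, 1.0   # shape and scale of Pareto 1
alpha2, xm2 = 1.5, 5.0   # heavier tail for the top group

def sample_mixture(n):
    """Draw n wages from the two-component Pareto mixture."""
    comp = rng.random(n) < w
    u = rng.random(n)
    # inverse-CDF sampling: X = xm * U**(-1/alpha) for Pareto(alpha, xm)
    x1 = xm1 * u ** (-1.0 / alpha1)
    x2 = xm2 * u ** (-1.0 / alpha2)
    return np.where(comp, x1, x2)

wages = sample_mixture(100_000)
top1 = np.quantile(wages, 0.99)
share_top1 = wages[wages >= top1].sum() / wages.sum()
print(f"share of total wages held by the top 1%: {share_top1:.2%}")
```

The heavier second component concentrates a disproportionate share of total wages in the top percentile, which is the qualitative pattern the paper models.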
Muhammad Aslam, Abdur Razzaque Mughal and Munir Ahmad
Abstract
Purpose
The purpose of this paper is to propose group acceptance sampling plans for the case in which the lifetime of the submitted product follows the Pareto distribution.
Design/methodology/approach
The single‐point approach (only consumer's risk) is used to find the plan parameter of the proposed plan for specified values of consumer's risk, producer's risk, acceptance number, number of testers and experiment time.
Findings
Tables are constructed using the Poisson and the weighted Poisson distribution. Extensive tables are provided for practical use.
Research limitations/implications
The tables in this paper can be used only when the lifetime of a product follows the Pareto distribution of the second kind.
Practical implications
The result can be used to test the product while saving the cost and time of the experiment. The use of the weighted Poisson distribution provides a smaller group size (sample size) than the plans in the literature.
Social implications
By implementing the proposed plan, the experiment cost can be minimized.
Originality/value
The novelty of this paper is that the Poisson and weighted Poisson distributions, rather than the binomial distribution, are used to find the plan parameters of the proposed plan when the lifetime of the submitted product follows the Pareto distribution of the second kind.
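The single-point search for a plan parameter can be sketched as follows: under the Poisson approximation, find the smallest number of groups g such that a lot of bad quality is accepted with probability at most the consumer’s risk. The Lomax (Pareto of the second kind) parameters, termination time, testers per group and acceptance number below are hypothetical, not values from the paper’s tables.

```python
from math import exp, factorial

def poisson_cdf(c, lam):
    """P(X <= c) for X ~ Poisson(lam)."""
    return sum(exp(-lam) * lam**i / factorial(i) for i in range(c + 1))

def lomax_failure_prob(t, theta, sigma):
    """P(fail by time t) under a Pareto distribution of the second kind (Lomax)."""
    return 1.0 - (1.0 + t / sigma) ** (-theta)

def min_groups(p, r, c, beta):
    """Smallest number of groups g such that a bad lot is accepted with
    probability at most beta (consumer's risk), using the Poisson
    approximation to the number of failures among n = g*r items."""
    g = 1
    while poisson_cdf(c, g * r * p) > beta:
        g += 1
    return g

# Hypothetical plan inputs (not the paper's tables):
theta, sigma = 2.0, 1.0   # Lomax shape and scale under the bad-quality hypothesis
t = 1.0                   # termination time of the experiment
r, c, beta = 5, 2, 0.10   # testers per group, acceptance number, consumer's risk

p = lomax_failure_prob(t, theta, sigma)
g = min_groups(p, r, c, beta)
print(f"failure probability p = {p:.3f}, required groups g = {g}")
```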
Abstract
Purpose
The purpose of this paper is to explore whether the databases of a certain library are Pareto-compliant and, if so, to what extent the Pareto principle is evident in their performance. The other purpose is to determine differences in Pareto principle performance according to time and database type.
Design/methodology/approach
Data on full-text downloads from six e-resources – Elsevier ScienceDirect (SD), Wiley Blackwell, Springer Journal, EBSCO Business Source Premier (BSP), American Chemical Society and American Institute of Physics (AIP) – for the period 2007-2013 were analysed; 42 samples were collected from these databases. The proportion of frequently downloaded journals in each database was used as an indicator of differences in Pareto principle performance over time. The difference between the proportion of frequently downloaded journals and the classic proportion of 20 per cent was used as an indicator of differences in Pareto principle performance related to database type.
Findings
Thirty-three samples (78.57 per cent) exhibited the Pareto principle. Four databases – Elsevier SD, Wiley Blackwell, EBSCO BSP and AIP – consistently exhibited the Pareto principle. The differences over time were not significant. The two multi-discipline databases – Elsevier SD and Wiley Blackwell – fluctuated more moderately than the two single-discipline databases – EBSCO BSP and AIP. Multi-discipline and single-discipline databases showed some differences in Pareto principle performance; however, these differences were not remarkable.
Originality/value
The analysis confirmed that e-journal downloads from e-journal databases divide into frequently and infrequently downloaded titles, as the Pareto principle predicts. Analysing these patterns is important for improving digital resource acquisition and user service.
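The 80/20 check behind such an analysis can be sketched in a few lines: sort journals by download count and compute the share of total downloads accounted for by the top 20 per cent of titles. The download counts below are invented for illustration, not the paper’s data.

```python
import numpy as np

def top_share(downloads, top_fraction=0.2):
    """Fraction of total downloads accounted for by the most-downloaded
    top_fraction of journals."""
    d = np.sort(np.asarray(downloads, dtype=float))[::-1]  # descending
    k = max(1, int(round(top_fraction * len(d))))
    return d[:k].sum() / d.sum()

# Hypothetical download counts for ten journals:
counts = [500, 300, 90, 40, 25, 15, 12, 10, 5, 3]
share = top_share(counts)
print(f"top 20% of journals account for {share:.0%} of downloads")
```

A sample is “Pareto-compliant” in this sense when the computed share is close to (or above) 80 per cent.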
Abstract
Purpose
This study aims to estimate the firm size distributions that belong to the service sector and manufacturing sector in Korea.
Design/methodology/approach
When estimating the firm size distribution, the author considers two major factors. First, the firm size distribution may follow a gamma distribution rather than traditionally accepted distributions such as the Pareto or log-normal distribution. In particular, industry-specific enterprises can have different gamma-type size distributions. Second, the firm size distribution applied to this study’s data set should reflect several factors; for example, a mixture gamma distribution should be estimated and compared, because the data set comprises small, medium-sized and large companies.
Findings
Using data on 8,230 firms in 2013, the author estimates a mixture gamma distribution for firm size.
Originality/value
From the comparison, the following characteristics of the firm size distribution emerge: first, the firm size distribution of the manufacturing sector has a longer tail than that of the service sector. Second, the manufacturing firm size distribution dominates the country-wide firm size distribution. Third, one of the three components that make up the mixed gamma firm size distribution accounts for 99 per cent of the distribution. From the estimated firm size distributions of the service and manufacturing sectors in Korea, the author draws strategy and policy implications for start-up firms.
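A minimal sketch of fitting a two-component gamma mixture by expectation-maximization, using a moment-matching M-step rather than a full maximum-likelihood update, and synthetic “firm sizes” rather than the study’s 8,230 firms:

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(1)

# Synthetic data: a small-firm and a large-firm component (hypothetical).
x = np.concatenate([gamma.rvs(a=2.0, scale=1.0, size=4000, random_state=rng),
                    gamma.rvs(a=6.0, scale=5.0, size=1000, random_state=rng)])

def em_gamma_mixture(x, n_iter=200):
    """EM for a two-component gamma mixture with a moment-matching M-step
    (a sketch; a full fit would update shapes by maximum likelihood)."""
    w = np.array([0.5, 0.5])
    a = np.array([1.0, 4.0])
    scale = np.array([np.mean(x) / 2, np.mean(x)])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        dens = np.stack([w[k] * gamma.pdf(x, a=a[k], scale=scale[k]) for k in range(2)])
        r = dens / dens.sum(axis=0)
        # M-step: match each component's weighted mean and variance,
        # using shape = m^2/v and scale = v/m for a gamma distribution
        w = r.mean(axis=1)
        for k in range(2):
            m = np.average(x, weights=r[k])
            v = np.average((x - m) ** 2, weights=r[k])
            a[k], scale[k] = m * m / v, v / m
    return w, a, scale

w, a, scale = em_gamma_mixture(x)
print("weights:", w, "shapes:", a, "scales:", scale)
```

With well-separated components, the fitted component means (shape × scale) recover the small-firm and large-firm groups.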
Kuo‐Ching Chiou and Lee‐Ing Tong
Abstract
Reliability engineers must not only consider the consumption of energy, capital and material resources, but also seek more economic means of completing experiments effectively. This study derives formulae for computing ratios of expected type‐II censoring times and expected complete sampling times when the lifetime adheres to two‐parameter Pareto and Rayleigh distributions. Utilizing such formulae allows the construction of tables providing information about how much experiment time can be saved by employing a type‐II censoring plan instead of a complete sampling plan. Engineers can employ the proposed tables to determine the censoring number, the initial sample size and the other relevant parameters for reducing the total experiment time. Illustrative examples demonstrate the effectiveness of the proposed procedure.
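The time-saving ratio can also be checked by simulation. The sketch below estimates E[X₍ᵣ₎]/E[X₍ₙ₎], the expected duration of a type-II censored test (stop at the r-th failure) relative to a complete test (wait for all n failures), for two-parameter Pareto lifetimes. The test setting and Pareto parameters are hypothetical, not values from the paper’s tables.

```python
import numpy as np

rng = np.random.default_rng(2)

def censoring_time_ratio(n, r, alpha, xm, n_sim=20_000):
    """Monte Carlo estimate of E[X_(r)] / E[X_(n)] for n units on test
    with i.i.d. two-parameter Pareto(alpha, xm) lifetimes."""
    u = rng.random((n_sim, n))
    # inverse-CDF sampling, then sort each simulated test's failure times
    x = np.sort(xm * u ** (-1.0 / alpha), axis=1)
    return x[:, r - 1].mean() / x[:, n - 1].mean()

# Hypothetical setting: 10 units on test, stop at the 7th failure
ratio = censoring_time_ratio(n=10, r=7, alpha=3.0, xm=1.0)
print(f"expected test time under censoring is {ratio:.1%} of the complete test")
```

Ratios like this are exactly what the paper’s tables report in closed form, letting engineers trade sample size and censoring number against total experiment time.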
Abstract
Purpose – The purpose of this chapter is to project the global emergence of megacities through the 21st century using population scenarios consistent with the Special Report on Emissions Scenarios (SRES) of the Intergovernmental Panel on Climate Change (IPCC).
Methodology – A dynamic urban growth model is developed based on a scale-independent theory of growing networks, taking into consideration the geographical and climatic suitability of city locations. From a national urban population scenario, the model can generate a series of megacity projections whose city size distribution is consistent with Zipf's law. The model is applied to population projections for 45,316 cities around the world using three population scenarios from SRES.
Findings – All of the projections indicate that a large number of megacities will be generated in developing regions towards 2100, although the range is wide and depends on the population assumed in the scenarios. Some results indicate an extreme population concentration in megacities; this might be undesirable for national security, quality of life, and sustainable development. Transport policies affect urban growth and national land development through changes in mobility and accessibility across the nation.
Implications – The results presented in this chapter could serve to stimulate discussions on urban and national transport policies and planning, particularly in China.
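The rank-size rule the model reproduces can be stated in one line: under Zipf's law with exponent 1, the k-th largest city has a population proportional to 1/k. A toy sketch (the chapter's growing-network model generates this distribution endogenously rather than imposing it, and the largest-city figure below is hypothetical):

```python
def zipf_city_sizes(largest, n_cities):
    """Rank-size populations implied by Zipf's law with exponent 1:
    the k-th largest city has population largest / k."""
    return [largest / k for k in range(1, n_cities + 1)]

sizes = zipf_city_sizes(largest=20_000_000, n_cities=5)
print(sizes)  # 20M, 10M, ~6.7M, 5M, 4M
```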
Renato de Siqueira Motta, Silvana Maria Bastos Afonso, Paulo Roberto Lyra and Ramiro Brito Willmersdorf
Abstract
Purpose
Optimization under a deterministic approach generally leads to a final design whose performance may degrade significantly, and whose constraints may be violated, because of perturbations arising from uncertainties. The purpose of this paper is to develop a strategy for obtaining an optimum design that is less sensitive to changes in uncertain parameters. The process of finding such optima is referred to as robust design optimization (RDO), in which improvement of the performance and reduction of its variability are sought while maintaining the feasibility of the solution. This overall process is very time consuming and requires a robust tool to conduct the optimum search efficiently.
Design/methodology/approach
In this paper, the authors propose an integrated tool to efficiently obtain RDO solutions. The tool encompasses suitable multiobjective optimization (MO) techniques (Normal-Boundary Intersection, Normalized Normal-Constraint, the weighted sum method and min-max methods), a surrogate model using a reduced-order method for cheap function evaluations, and an adequate procedure for uncertainty quantification (the Probabilistic Collocation Method).
Findings
To illustrate the application of the proposed tool, 2D structural problems are considered. The integrated tool proves to be very effective, reducing the computational time by up to five orders of magnitude compared to solutions obtained via classical standard approaches.
Originality/value
The combination of methodologies proposed in the paper leads to a very powerful tool for structural optimum design under uncertain parameters, and it can be extended to deal with other classes of applications.
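One of the scalarizations such a tool encompasses, the weighted-sum method, can be sketched on a toy robust-design problem: minimize the mean of an uncertain performance function plus a multiple of its standard deviation. The performance function, uncertainty model and plain Monte Carlo sampling below are illustrative stand-ins; the paper uses structural models and the Probabilistic Collocation Method for uncertainty quantification.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)

# Toy performance function with an uncertain parameter p: f(x; p) = (x - p)**2 + 0.1*x
# Sampling the uncertain parameter (the paper would use probabilistic collocation):
p_samples = rng.normal(loc=1.0, scale=0.3, size=2000)

def robust_objective(x, k=1.0):
    """Weighted-sum scalarization of the (mean, std) objectives."""
    f = (x - p_samples) ** 2 + 0.1 * x
    return f.mean() + k * f.std()

res = minimize_scalar(robust_objective, bounds=(-5, 5), method="bounded")
print(f"robust optimum x* = {res.x:.3f}")
```

Varying the weight k traces out different trade-offs between average performance and its variability, which is the essence of the RDO Pareto front.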
Ashraf Norouzi and Amir Albadvi
Abstract
Purpose
The marketing/finance interface and the application of its insights to marketing decisions have recently attracted great interest among marketing researchers and practitioners. There is a relatively large body of marketing literature on incorporating modern portfolio theory (MPT) into the customer portfolio context and taking advantage of it in marketing resource allocation decisions. Previous studies have modelled customer portfolio risk as the historical return/profitability volatility of the customer base. However, risk is a future-oriented measure that deals with the future volatility of the return stream. This study aims to address this research problem.
Design/methodology/approach
The well-known Pareto/negative binomial distribution (Pareto/NBD) approach is used to model customer purchases in the non-contractual setting of the research practice. The results were then used to simulate customers’ future buying behaviour and associated returns via the Monte Carlo simulation approach. Subsequently, the mean-variance portfolio optimization model was applied to find the optimal customer portfolio mix.
Findings
The results illustrate the better performance of the proposed efficient portfolio versus the current customer portfolio. These results are applicable to analyzing customer portfolio composition and can be used as guidance for marketing resource allocation decisions across segments.
Originality/value
This study proposes a new approach to analyzing the customer portfolio using customers’ future buying behaviour. Taking advantage of the rich marketing literature on statistical assumptions describing customers’ buying behaviour, this study takes some steps forward in applying MPT in the customer portfolio management context.
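The final mean-variance step can be sketched as follows, assuming future segment returns have already been simulated (in the paper, via the Pareto/NBD model and Monte Carlo simulation). The return moments and the unconstrained closed-form weights below are illustrative, not the study’s results.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical simulated per-period returns for two customer segments:
# a high-return/high-volatility segment and a steadier one.
ret = rng.normal(loc=[0.08, 0.05], scale=[0.20, 0.07], size=(5000, 2))

mu = ret.mean(axis=0)
cov = np.cov(ret, rowvar=False)

def mean_variance_weights(mu, cov, risk_aversion=5.0):
    """Unconstrained mean-variance weights w ∝ Σ⁻¹μ, normalized to sum to 1."""
    w = np.linalg.solve(cov, mu) / risk_aversion
    return w / w.sum()

w = mean_variance_weights(mu, cov)
print("segment allocation:", w)
```

The lower-volatility segment receives the larger allocation here, illustrating how future return variability, not just expected return, drives the marketing resource split.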
SERGIO M. FOCARDI and FRANK J. FABOZZI
Abstract
Fat‐tailed distributions have been found in many financial and economic variables ranging from forecasting returns on financial assets to modeling recovery distributions in bankruptcies. They have also been found in numerous insurance applications such as catastrophic insurance claims and in value‐at‐risk measures employed by risk managers. Financial applications include:
Abstract
Purpose
The purpose of this paper is to develop a dynamic model to understand the evolution of the firm size distribution in developing countries.
Design/methodology/approach
Evidence points to the existence of a “missing middle” in the size distribution of firms in developing countries. In the model presented in this paper, the bimodality arises because of agents optimally selecting into a traditional and a modern sector. The key parameter in this model is the mean level of knowledge in the economy.
Findings
For a low mean, the two sectors co-exist. As the mean rises, the size distribution converges from a bimodal to a unimodal distribution.
Originality/value
Unlike existing explanations, this model does not rely on frictions to generate the bimodality in size distribution.
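The bimodal-to-unimodal transition can be illustrated with a stylized mixture: treat log firm size as a mixture of a traditional sector and a modern sector, and let the modern-sector share stand in for the mean level of knowledge. This is a sketch of the qualitative mechanism only, not the paper's model.

```python
import numpy as np
from scipy.stats import norm

def log_size_density(z, w_modern, gap=3.0):
    """Density of log firm size: a mixture of a traditional sector, N(0,1),
    and a modern sector, N(gap,1). w_modern is the share of agents whose
    knowledge selects them into the modern sector (a stylized stand-in
    for the paper's selection mechanism)."""
    return (1 - w_modern) * norm.pdf(z, 0, 1) + w_modern * norm.pdf(z, gap, 1)

def n_modes(d):
    """Count strict local maxima of a sampled density curve."""
    d = np.asarray(d)
    return int(np.sum((d[1:-1] > d[:-2]) & (d[1:-1] > d[2:])))

z = np.linspace(-4, 7, 4000)
print(n_modes(log_size_density(z, w_modern=0.5)))   # low mean knowledge: bimodal
print(n_modes(log_size_density(z, w_modern=0.98)))  # high mean knowledge: unimodal
```

As the modern-sector share rises, the small-firm peak is absorbed and the "missing middle" disappears without any friction in the mechanism.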