Search results
1 – 10 of 247

Daniel de Abreu Pereira Uhr, Mikael Jhordan Lacerda Cordeiro and Júlia Gallego Ziero Uhr
Abstract
Purpose
This research assesses the economic impact of biomass plant installations on Brazilian municipalities, focusing on (1) labor income, (2) sectoral labor income and (3) income inequality.
Design/methodology/approach
Municipal data from the Annual Social Information Report, the National Electric Energy Agency and the National Institute of Meteorology spanning 2002 to 2020 are utilized. The Synthetic Difference-in-Differences methodology is employed for empirical analysis, and robustness checks are conducted using the Doubly Robust Difference-in-Differences and the Double/Debiased Machine Learning methods.
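The estimator family used here builds on the canonical two-group, two-period difference-in-differences contrast, which can be sketched in a few lines. The numbers below are purely illustrative (chosen to land near the R$688 order of magnitude reported in the findings, not the study's data), and Synthetic Difference-in-Differences would additionally reweight control municipalities and pre-treatment periods, which this sketch omits.

```python
# Minimal two-group, two-period difference-in-differences sketch.
# All figures are hypothetical mean annual formal wages in R$.

def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """ATT = change for the treated group minus change for the controls."""
    return (treated_post - treated_pre) - (control_post - control_pre)

att = did_estimate(treated_pre=14_000, treated_post=15_200,
                   control_pre=14_100, control_post=14_612)
# att = 1200 - 512 = 688
```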
Findings
The findings reveal that biomass plant installations lead to an average annual increase of approximately R$688.00 in formal workers' wages and reduce formal income inequality, with notable benefits observed for workers in the industry and agriculture sectors. The robustness tests support and validate the primary results, highlighting the positive implications of renewable energy integration on economic development in the studied municipalities.
Originality/value
This article represents a groundbreaking contribution to the existing literature as it pioneers the identification of the impact of biomass plant installation on formal employment income and local economic development in Brazil. To the best of our knowledge, this study is the first to uncover such effects. Moreover, the authors comprehensively examine sectoral implications and formal income inequality.
Christine Amsler, Robert James, Artem Prokhorov and Peter Schmidt
Abstract
The traditional predictor of technical inefficiency proposed by Jondrow, Lovell, Materov, and Schmidt (1982) is a conditional expectation. This chapter explores whether, and by how much, the predictor can be improved by using auxiliary information in the conditioning set. It considers two types of stochastic frontier models. The first type is a panel data model where composed errors from past and future time periods contain information about contemporaneous technical inefficiency. The second type is when the stochastic frontier model is augmented by input ratio equations in which allocative inefficiency is correlated with technical inefficiency. Compared to the standard kernel-smoothing estimator, a newer estimator based on a local linear random forest helps mitigate the curse of dimensionality when the conditioning set is large. Besides numerous simulations, there is an illustrative empirical example.
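For the baseline normal/half-normal stochastic frontier model, the Jondrow et al. (1982) predictor discussed above has a closed form. A minimal sketch under that specification (the chapter itself studies richer conditioning sets, which this sketch does not attempt):

```python
import math

def jlms_predictor(eps, sigma_u, sigma_v):
    """E[u | eps] for the normal/half-normal stochastic frontier model
    (Jondrow, Lovell, Materov and Schmidt, 1982). eps is the composed
    residual v - u from a production frontier; sigma_u and sigma_v are
    the standard deviations of the inefficiency and noise components."""
    s2 = sigma_u**2 + sigma_v**2
    mu_star = -eps * sigma_u**2 / s2
    sigma_star = sigma_u * sigma_v / math.sqrt(s2)
    z = mu_star / sigma_star
    phi = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)   # standard normal pdf
    Phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))            # standard normal cdf
    # Mean of a normal truncated below at zero: always positive.
    return mu_star + sigma_star * phi / Phi
```

A large negative residual signals high inefficiency, so the predictor should be larger for eps = -1 than for eps = +1.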
Heeyun Kim and Paula Clasing-Manquian
Abstract
Education researchers have been urged to utilize causal inference methods to estimate the policy effect more rigorously. While randomized controlled trials (RCTs) are the gold standard for assessing causality, RCTs are infeasible in some educational settings, particularly when ethical concerns or high cost are involved. Quasi-experimental research designs are the best alternative approach to study educational topics not amenable to RCTs, as they mimic experimental conditions and use statistical techniques to reduce bias from variables omitted in the empirical models. In this chapter, we introduce and discuss the core concepts, applicability, and limitations of three quasi-experimental methods in higher education research (i.e., difference-in-differences, instrumental variables, and regression discontinuity). By introducing each of these techniques, we aim to expand the higher education researcher's toolbox and encourage the use of these quasi-experimental methods to evaluate educational interventions.
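Of the three designs named above, sharp regression discontinuity is perhaps the easiest to sketch: fit a line on each side of the assignment cutoff within a bandwidth and read off the gap at the cutoff. A minimal sketch with hypothetical data and function names (not from the chapter), using a plain least-squares line fit:

```python
def ols_line(xs, ys):
    """Intercept and slope of a simple least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def sharp_rd(xs, ys, cutoff, bandwidth):
    """Sharp RD effect: gap between the two local linear fits at the cutoff."""
    left = [(x, y) for x, y in zip(xs, ys) if cutoff - bandwidth <= x < cutoff]
    right = [(x, y) for x, y in zip(xs, ys) if cutoff <= x <= cutoff + bandwidth]
    a_l, b_l = ols_line([x for x, _ in left], [y for _, y in left])
    a_r, b_r = ols_line([x for x, _ in right], [y for _, y in right])
    return (a_r + b_r * cutoff) - (a_l + b_l * cutoff)
```

With data generated as y = 1 + 0.5x plus a jump of 2 at the cutoff, the estimator recovers the jump exactly.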
Luis Orea, Inmaculada Álvarez-Ayuso and Luis Servén
Abstract
This chapter provides an empirical assessment of the effects of infrastructure provision on structural change and aggregate productivity using industry-level data for a set of developed and developing countries over 1995–2010. A distinctive feature of the empirical strategy is that it allows the resource reallocation directly attributable to infrastructure provision to be measured. To achieve this, a two-level top-down decomposition of aggregate productivity that combines and extends several strands of the literature is proposed. The empirical application reveals significant production losses attributable to misallocation of inputs across firms, especially among African countries. The results also show that infrastructure provision has stimulated aggregate total factor productivity growth through both within- and between-industry productivity gains.
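The within/between logic behind such a decomposition can be illustrated with a one-level shift-share split of aggregate productivity growth (the chapter's two-level decomposition is richer; the function below is a hypothetical simplification with made-up numbers):

```python
def within_between(shares0, prod0, shares1, prod1):
    """Shift-share split of aggregate productivity growth.
    Aggregate productivity is sum(s_i * p_i); the change decomposes as
    within  = sum of initial-share-weighted productivity changes, and
    between = sum of share changes weighted by final productivity
    (the reallocation term). The two parts sum exactly to the change."""
    within = sum(s0 * (p1 - p0)
                 for s0, p0, p1 in zip(shares0, prod0, prod1))
    between = sum((s1 - s0) * p1
                  for s0, s1, p1 in zip(shares0, shares1, prod1))
    return within, between
```

The identity sum(s1*p1) - sum(s0*p0) = within + between follows by adding and subtracting sum(s0*p1), so the split is exact by construction.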
Moussa Sigue, Désiré Drabo, Soumaïla Woni, Gnanderman Sirpe and Aminata Ouedraogo
Abstract
Purpose
This paper aims to assess the short- and long-run effects of the interaction between institutional quality and financial development (FD) on the competitiveness of the WAEMU economy over the period 2007–2018.
Design/methodology/approach
The methodology consisted of cross-referencing a synthetic indicator of FD with indicators of institutional quality and then estimating an autoregressive distributed lag model.
Findings
The results of the pooled mean group and dynamic fixed effect estimations show a positive and significant impact of this interaction on the competitiveness of the economy in the long run. In the short run, the results for the direct effects are quite similar to the long-run results but differ for the interaction terms. The country-specific analysis mirrors the short-run pattern: the interaction between FD and institutional quality (political stability and government effectiveness) negatively affects the competitiveness of Burkina Faso, Ivory Coast and Mali, and positively affects that of Benin and Senegal.
Social implications
These results suggest the need for effective policies to improve the quality of institutions to enhance the mobilization of financial resources through FD to ensure the competitiveness of economies. Improving the quality of the political and institutional environment is a prerequisite for economic competitiveness.
Originality/value
The paper is in line with the New Institutional Economics that developed in the 1970s. This framework is a heterogeneous body of work united by a common focus: determining the role of institutions in economic coordination. Unlike previous studies, which have focused on the contribution of the interaction between institutional quality variables and FD to economic growth, this paper analyzes the effects of this interaction on economic competitiveness. It therefore contributes to this literature and aims primarily to fill this gap.
Hongming Gao, Hongwei Liu, Weizhen Lin and Chunfeng Chen
Abstract
Purpose
Purchase conversion prediction aims to improve user experience and convert visitors into real buyers to drive sales of firms; however, the total conversion rate is low, especially for e-retailers. To date, little is known about how e-retailers can scientifically detect users' intents within a purchase conversion funnel during their ongoing sessions and strategically optimize real-time marketing tactics corresponding to dynamic intent states. This study mainly aims to detect the real-time state of the conversion funnel based on graph theory, framed as a five-class classification problem over the overt real-time choice decisions (RTCDs) observed during an ongoing session: click, tag-to-wishlist, add-to-cart, remove-from-cart and purchase.
Design/methodology/approach
The authors propose a novel graph-theoretic framework to detect different states of the conversion funnel by identifying a user's unobserved mindset as revealed by their navigation process graph, namely the clickstream graph. First, the raw clickstream data are partitioned into individual sessions using a 30-min time-out heuristic. Then, the authors convert each session into a sequence of temporal item-level clickstream graphs and conduct temporal graph feature engineering based on the basic, single-, dyadic- and triadic-node and global characteristics. Furthermore, the synthetic minority oversampling technique is adopted to address the problem of classifying imbalanced data. Finally, the authors train and test the proposed approach with several popular artificial intelligence algorithms.
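The first step of this pipeline, the 30-minute time-out heuristic, can be sketched as follows for a single user's event stream (a minimal sketch with my own function name; graph construction and feature engineering are omitted):

```python
from datetime import datetime, timedelta

def sessionize(events, timeout=timedelta(minutes=30)):
    """Split one user's event timestamps into sessions, starting a new
    session whenever the gap since the previous event exceeds the
    timeout (the 30-minute heuristic described above)."""
    sessions = []
    for ts in sorted(events):
        if sessions and ts - sessions[-1][-1] <= timeout:
            sessions[-1].append(ts)   # same session: gap within timeout
        else:
            sessions.append([ts])     # gap too large: open a new session
    return sessions
```

Each resulting session would then be converted into its sequence of temporal item-level graphs in the authors' framework.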
Findings
The graph-theoretic approach validates that users' latent intent states within the conversion funnel can be interpreted as time-varying features of their online graph footprints. In particular, the experimental results indicate that the graph-theoretic feature-oriented models achieve a substantial improvement of over 27% in both macro-averaged and micro-averaged area under the precision-recall curve compared to the conventional models. In addition, the top five informative graph features for RTCDs are found to be Transitivity, Edge, Node, Degree and Reciprocity. In terms of interpretability, the basic, single-, dyadic- and triadic-node and global characteristics of clickstream graphs each have their specific advantages.
Practical implications
The findings suggest that the temporal graph-theoretic approach can form an efficient and powerful AI-based real-time intent detecting decision-support system. Different levels of graph features have their specific interpretability on RTCDs from the perspectives of consumer behavior and psychology, which provides a theoretical basis for the design of computer information systems and the optimization of the ongoing session intervention or recommendation in e-commerce.
Originality/value
To the best of the authors' knowledge, this is the first study to apply clickstream graphs and real-time decision choices in conversion prediction and detection. Most studies have only meditated on a binary classification problem, while this study applies a graph-theoretic approach in a five-class classification problem. In addition, this study constructs temporal item-level graphs to represent the original structure of clickstream session data based on graph theory. The time-varying characteristics of the proposed approach enhance the performance of purchase conversion detection during an ongoing session.
Nicolae Stef and Anthony Terriau
Abstract
We investigate how firing notification procedures influence wage growth. Using a sample of 33 countries over the period 2006–2015, we show that administrative requirements in cases of dismissal have a positive and significant effect on wage growth. The result is robust even after controlling for the endogeneity of the firing notification restrictions, the involvement of third parties in the wage bargaining process, the minimum wage, the firms' training policy, and the composition of employment. These findings suggest that firing notification procedures foster the growth of wages by increasing the bargaining power of incumbent workers.
Marcin Nowak, Marta Pawłowska-Nowak, Małgorzata Kokocińska and Piotr Kułyk
Abstract
Purpose
With the use of the grey incidence analysis (GIA), indicators such as the absolute degree of grey incidence (εij), relative degree of grey incidence (rij) or synthetic degree of grey incidence (ρij) are calculated. However, it seems that some assumptions made to calculate them are arguable, which may also have a material impact on the reliability of test results. In this paper, the authors analyse one of the indicators of the GIA, namely the relative degree of grey incidence. The aim of the article was to verify the hypothesis: in determining the relative degree of grey incidence, the method of standardisation of elements in a series significantly affects the test results.
Design/methodology/approach
To achieve the purpose of the article, the authors used the numerical simulation method and the logical analysis method (to draw conclusions from the tests).
Findings
It turned out that the applied method of standardising elements in series when calculating the relative degree of grey incidence significantly affects the test results. Moreover, the manner of standardisation used in the original method (which involves dividing all elements by the first element) is not the best. Much more reliable results are obtained by a standardisation that involves dividing all elements by their arithmetic mean.
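The two standardisation schemes compared in these findings can be sketched directly; the function names are mine, not from the article:

```python
def standardize_by_first(series):
    """Original GIA initialisation: divide every element by the first."""
    return [x / series[0] for x in series]

def standardize_by_mean(series):
    """Alternative the authors favour: divide every element by the
    arithmetic mean of the series."""
    m = sum(series) / len(series)
    return [x / m for x in series]
```

The relative degree of grey incidence is then computed on the standardised series, so the choice between the two schemes propagates into the final indicator, which is the sensitivity the authors document.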
Research limitations/implications
The main limitation of the evaluation is its restricted scope of inference, since the results refer to only one of the indicators belonging to the GIA.
Originality/value
In this article, the authors have evaluated the model of GIA in which the relative degree of grey incidence is determined. As a result of the research, the authors have proposed a recommendation regarding a change in the method of standardising variables, which will contribute to obtaining more reliable results in relational tests using the grey system theory.
Glenn W. Harrison and J. Todd Swarthout
Abstract
We take Cumulative Prospect Theory (CPT) seriously by rigorously estimating structural models using the full set of CPT parameters. Much of the literature only estimates a subset of CPT parameters, or more simply assumes CPT parameter values from prior studies. Our data are from laboratory experiments with undergraduate students and MBA students facing substantial real incentives and losses. We also estimate structural models from Expected Utility Theory (EUT), Dual Theory (DT), Rank-Dependent Utility (RDU), and Disappointment Aversion (DA) for comparison. Our major finding is that a majority of individuals in our sample locally asset integrate. That is, they see a loss frame for what it is, a frame, and behave as if they evaluate the net payment rather than the gross loss when one is presented to them. This finding is devastating to the direct application of CPT to these data for those subjects. Support for CPT is greater when losses are covered out of an earned endowment rather than house money, but RDU is still the best single characterization of individual and pooled choices. Defenders of the CPT model claim, correctly, that the CPT model exists "because the data says it should." In other words, the CPT model was born of a wide range of stylized facts culled from parts of the cognitive psychology literature. If one is to take the CPT model seriously and rigorously, then it needs to do a much better job of explaining the data than we see here.
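For readers unfamiliar with the "full set of CPT parameters" being estimated, the standard Tversky–Kahneman (1992) functional forms can be sketched as below. The parameter values are their published point estimates, used purely as an illustration; they are not estimates from this paper:

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) probability weighting function.
    gamma = 0.61 is their gain-domain estimate; small probabilities
    are overweighted and large ones underweighted."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def cpt_value(x, alpha=0.88, lam=2.25):
    """Two-part power value function with loss aversion lambda:
    gains are raised to alpha, losses scaled up by lambda."""
    return x**alpha if x >= 0 else -lam * (-x)**alpha
```

A structural estimation would recover gamma, alpha and lambda (plus a loss-domain curvature and weighting parameter) jointly from observed choices, which is what "the full set of CPT parameters" refers to.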
Abstract
Purpose
The use of economic sanctions has grown dramatically in recent decades. Nevertheless, many arguments are presented in the public policy space regarding their effects on target populations. The author presents the first systematic analysis of the effects of sanctions on living conditions in target countries.
Design/methodology/approach
This paper provides a comprehensive survey and assessment of the literature on the effects of economic sanctions on living standards in target countries. The author identifies 31 studies that apply quantitative econometric or calibration methods to cross-country and national data to assess the impact of economic sanctions on indicators of human and economic development. The author provides in-depth discussions of three sanctions episodes—Iran, Afghanistan and Venezuela—that illustrate the channels through which sanctions affect living conditions in target countries.
Findings
Of the 31 studies, 30 find that sanctions have negative effects on outcomes ranging from per capita income to poverty, inequality, mortality and human rights. The author provides new results showing that 54 countries (27% of all countries, accounting for 29% of the world economy) are sanctioned today, up from only 4% of countries in the 1960s. In the three cases discussed, sanctions that restricted the access of governments to foreign exchange limited the ability of states to provide essential public goods and services and generated substantial negative spillovers on private sector and nongovernmental actors.
Originality/value
This is the first literature survey that systematically assesses the quantitative evidence on the effect of sanctions on living conditions in target countries.