Search results

1 – 10 of 139
Book part
Publication date: 23 October 2023

Glenn W. Harrison and J. Todd Swarthout

Abstract

We take Cumulative Prospect Theory (CPT) seriously by rigorously estimating structural models using the full set of CPT parameters. Much of the literature estimates only a subset of CPT parameters, or more simply assumes CPT parameter values from prior studies. Our data are from laboratory experiments with undergraduate students and MBA students facing substantial real incentives and losses. We also estimate structural models from Expected Utility Theory (EUT), Dual Theory (DT), Rank-Dependent Utility (RDU), and Disappointment Aversion (DA) for comparison. Our major finding is that a majority of individuals in our sample locally asset integrate. That is, they see a loss frame for what it is, a frame, and behave as if they evaluate the net payment rather than the gross loss when one is presented to them. This finding is devastating to the direct application of CPT to these data for those subjects. Support for CPT is greater when losses are covered out of an earned endowment rather than house money, but RDU is still the best single characterization of individual and pooled choices. Defenders of the CPT model claim, correctly, that the CPT model exists “because the data says it should.” In other words, the CPT model was born of a wide range of stylized facts culled from parts of the cognitive psychology literature. If one is to take the CPT model seriously and rigorously, then it needs to do a much better job of explaining the data than we see here.
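To make concrete what estimating “the full set of CPT parameters” involves, here is a minimal sketch of CPT valuation of a simple lottery. It assumes the Tversky–Kahneman (1992) functional forms with conventional textbook parameter values, not the chapter's estimates, and for brevity applies a single weighting curve to gains and losses.

```python
# Illustrative CPT valuation of a lottery. All parameter values are
# illustrative assumptions (Tversky-Kahneman 1992 median estimates).

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Power value function with loss aversion coefficient lam."""
    return x**alpha if x >= 0 else -lam * (-x)**beta

def weight(p, gamma=0.61):
    """Inverse-s probability weighting function."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def cpt_value(lottery):
    """CPT value of a lottery given as [(outcome, probability), ...].
    Gains and losses get separate rank-dependent (cumulative) weights."""
    gains = sorted((op for op in lottery if op[0] >= 0), key=lambda t: -t[0])
    losses = sorted((op for op in lottery if op[0] < 0), key=lambda t: t[0])
    total = 0.0
    for ranked in (gains, losses):   # best gain first; worst loss first
        cum = 0.0
        for x, p in ranked:
            total += (weight(cum + p) - weight(cum)) * value(x)
            cum += p
    return total

# A 50/50 gamble over +100 and -100: loss aversion makes it unattractive
print(cpt_value([(100, 0.5), (-100, 0.5)]))
```

Structural estimation then amounts to choosing (alpha, beta, lam, gamma) so that a probabilistic choice model built on `cpt_value` best fits observed binary choices.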

Details

Models of Risk Preferences: Descriptive and Normative Challenges
Type: Book
ISBN: 978-1-83797-269-2

Book part
Publication date: 23 October 2023

Nathaniel T. Wilcox

Abstract

The author presents new estimates of the probability weighting functions found in rank-dependent theories of choice under risk. These estimates are unusual in two senses. First, they are free of functional form assumptions about both utility and weighting functions, and they are based entirely on binary discrete choices rather than matching or valuation tasks, though they depend on assumptions concerning the nature of probabilistic choice under risk. Second, the estimated weighting functions contradict widely held priors of an inverse-s shape with a fixed point well in the interior of the (0,1) interval: instead, the author usually finds populations dominated by “optimists” who uniformly overweight the best outcomes in risky options. The choice pairs used here mostly do not provoke similarity-based simplifications. In a third experiment, the author shows that the presence of choice pairs that provoke similarity-based computational shortcuts does indeed flatten estimated probability weighting functions.
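The contrast drawn here can be seen numerically: a standard inverse-s function overweights small probabilities and underweights large ones, crossing the diagonal in the interior, whereas an “optimistic” function lies above the diagonal everywhere. A minimal sketch with illustrative parameter values (not the chapter's estimates):

```python
# Two probability weighting shapes. Parameters are conventional
# illustrative values, not estimates from this chapter.

def inverse_s(p, gamma=0.61):
    """Inverse-s: overweights small p, underweights large p,
    with an interior fixed point."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def optimistic(p, gamma=0.5):
    """Power weighting w(p) = p^gamma, gamma < 1: above the diagonal
    for all interior p, uniformly overweighting the best outcome."""
    return p**gamma

for p in (0.05, 0.33, 0.50, 0.95):
    print(f"p={p:.2f}  inverse-s={inverse_s(p):.3f}  optimistic={optimistic(p):.3f}")
```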

Details

Models of Risk Preferences: Descriptive and Normative Challenges
Type: Book
ISBN: 978-1-83797-269-2

Book part
Publication date: 5 April 2024

Christine Amsler, Robert James, Artem Prokhorov and Peter Schmidt

Abstract

The traditional predictor of technical inefficiency proposed by Jondrow, Lovell, Materov, and Schmidt (1982) is a conditional expectation. This chapter explores whether, and by how much, the predictor can be improved by using auxiliary information in the conditioning set. It considers two types of stochastic frontier models. The first is a panel data model in which composed errors from past and future time periods contain information about contemporaneous technical inefficiency. In the second, the stochastic frontier model is augmented by input ratio equations in which allocative inefficiency is correlated with technical inefficiency. Compared to the standard kernel-smoothing estimator, a newer estimator based on a local linear random forest helps mitigate the curse of dimensionality when the conditioning set is large. In addition to extensive simulations, an illustrative empirical example is provided.
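For reference, the baseline predictor being improved upon has a closed form in the normal/half-normal case. The sketch below evaluates it with illustrative parameter values; the composed error is eps = v − u, with v ~ N(0, sv²) and u ~ |N(0, su²)|.

```python
# Jondrow-Lovell-Materov-Schmidt (JLMS, 1982) predictor E[u | eps]
# for the normal/half-normal stochastic frontier model.
# Parameter values below are illustrative, not from the chapter.
from math import sqrt, exp, erf, pi

def npdf(z):
    return exp(-0.5 * z * z) / sqrt(2 * pi)

def ncdf(z):
    return 0.5 * (1 + erf(z / sqrt(2)))

def jlms(eps, su, sv):
    """Conditional expectation of inefficiency u given the composed error."""
    s2 = su**2 + sv**2
    mu_star = -eps * su**2 / s2      # conditional mean before truncation
    sig_star = su * sv / sqrt(s2)    # conditional standard deviation
    z = mu_star / sig_star
    # mean of a normal distribution truncated below at zero
    return mu_star + sig_star * npdf(z) / ncdf(z)

# A more negative composed error signals greater estimated inefficiency
print(jlms(-1.0, su=1.0, sv=0.5))
print(jlms(+1.0, su=1.0, sv=0.5))
```

The chapter's question is how much this prediction sharpens when the conditioning set contains more than the single contemporaneous eps.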

Book part
Publication date: 5 April 2024

Taining Wang and Daniel J. Henderson

Abstract

A semiparametric stochastic frontier model is proposed for panel data, incorporating several flexible features. First, a constant elasticity of substitution (CES) production frontier is considered without log-transformation, avoiding the non-negligible estimation bias that the transformation induces. Second, model flexibility is improved via semiparameterization, where the technology is an unknown function of a set of environment variables. The technology function accounts for latent heterogeneity across individual units, which can be freely correlated with inputs, environment variables, and/or inefficiency determinants. Furthermore, the technology function incorporates a single-index structure to circumvent the curse of dimensionality. Third, distributional assumptions on both the stochastic noise and the inefficiency are eschewed for model identification. Instead, only the conditional mean of the inefficiency is assumed, which depends on related determinants, chosen from a wide range of candidates, via a positive parametric function. As a result, technical efficiency is constructed without relying on an assumed distribution for the composite error. The model provides flexible structures for both the production frontier and the inefficiency, thereby alleviating the risk of model misspecification in production and efficiency analysis. The estimator involves series-based nonlinear least squares estimation for the unknown parameters and kernel-based local estimation for the technology function. Promising finite-sample performance is demonstrated through simulations, and the model is applied to investigate productive efficiency among OECD countries from 1970 to 2019.
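As a point of reference, the CES frontier estimated here in levels has the familiar closed form below. This is only the deterministic kernel with illustrative parameter values; it omits the semiparametric technology function and the inefficiency component.

```python
# Deterministic CES production frontier, evaluated directly in levels
# (no log-transformation). delta is the distribution parameter; the
# elasticity of substitution is 1/(1 + rho). Values are illustrative.

def ces(K, L, A=1.0, delta=0.4, rho=0.5):
    return A * (delta * K**(-rho) + (1 - delta) * L**(-rho)) ** (-1 / rho)

# Constant returns to scale: doubling both inputs doubles output
print(ces(4.0, 9.0), ces(8.0, 18.0))
```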

Book part
Publication date: 5 April 2024

Hung-pin Lai

Abstract

The standard method of estimating a stochastic frontier (SF) model is the maximum likelihood (ML) approach, with distributional assumptions of a symmetric two-sided stochastic error v and a one-sided inefficiency random component u. When v or u has a nonstandard distribution, for example when v follows a generalized t distribution or u has a χ2 distribution, the likelihood function can be complicated or intractable. This chapter introduces indirect inference for estimating SF models, in which only least squares estimation is used. There is no need to derive the density or likelihood function, so it is easier to handle models with complicated distributions in practice. The author examines the finite-sample performance of the proposed estimator and compares it with the standard ML estimator as well as the maximum simulated likelihood (MSL) estimator using Monte Carlo simulations, finding that the indirect inference estimator performs quite well in finite samples.
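The indirect inference idea can be sketched in a few lines: simulate composed errors from the structural model at trial parameter values, compute simple least-squares-style auxiliary statistics (moments of the residuals), and pick the parameters whose simulated statistics best match the observed ones. Below is a toy version for the normal/half-normal case using a grid search; the sample sizes, grid, and choice of auxiliary statistics are all illustrative assumptions, not the chapter's design.

```python
# Toy indirect inference for a normal/half-normal stochastic frontier.
import numpy as np

rng = np.random.default_rng(42)
n = 5000
su_true, sv_true = 1.0, 0.5

# "Observed" composed errors eps = v - u, v ~ N(0, sv^2), u ~ |N(0, su^2)|
eps_obs = rng.normal(0, sv_true, n) - np.abs(rng.normal(0, su_true, n))

def aux_stats(e):
    """Auxiliary statistics: mean, variance, third central moment."""
    c = e - e.mean()
    return np.array([e.mean(), (c**2).mean(), (c**3).mean()])

target = aux_stats(eps_obs)

# Common random numbers, reused for every candidate parameter value
draws = rng.normal(size=(2, 10 * n))

def distance(params):
    su, sv = params
    sim = sv * draws[0] - su * np.abs(draws[1])
    return float(((aux_stats(sim) - target) ** 2).sum())

grid = [(su, sv) for su in np.linspace(0.5, 1.5, 21)
                 for sv in np.linspace(0.2, 1.0, 17)]
su_hat, sv_hat = min(grid, key=distance)
print(su_hat, sv_hat)  # should land near the true (1.0, 0.5)
```

Note that nothing here requires the density of eps, which is exactly the appeal when v or u has an intractable distribution.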

Book part
Publication date: 23 October 2023

Morten I. Lau, Hong Il Yoo and Hongming Zhao

Abstract

We evaluate the hypothesis of temporal stability in risk preferences using two recent data sets from longitudinal lab experiments. Both experiments included a combination of decision tasks that allows one to identify a full set of structural parameters characterizing risk preferences under Cumulative Prospect Theory (CPT), including loss aversion. We consider temporal stability in those structural parameters at both population and individual levels. The population-level stability pertains to whether the distribution of risk preferences across individuals in the subject population remains stable over time. The individual-level stability pertains to within-individual correlation in risk preferences over time. We embed the CPT structure in a random coefficient model that allows us to evaluate temporal stability at both levels in a coherent manner, without having to switch between different sets of models to draw inferences at a specific level.

Details

Models of Risk Preferences: Descriptive and Normative Challenges
Type: Book
ISBN: 978-1-83797-269-2

Book part
Publication date: 9 November 2023

Anna Szelągowska and Ilona Skibińska-Fabrowska

Abstract

Research Background

Monetary policy implementation and corporate investment are closely intertwined. The aim of modern monetary policy is to mitigate economic fluctuations and stabilise economic growth. One way of influencing the real economy is to influence the level of investment by enterprises.

Purpose of the Chapter

This chapter provides evidence on how monetary policy affected corporate investment in Poland between 1Q 2000 and 3Q 2022. We investigate the impact of Polish monetary policy on investment outlays in contexts of high uncertainty.

Methodology

Using correlation analysis and a regression model, we show the relation between monetary policy and the investment outlays of Polish enterprises. We use the least squares method, the most widely used approach to linear model estimation. The evaluation covers model fit, the significance of the independent variables, and the properties of the random component (constancy of variance, autocorrelation, and conformity with the normal distribution), using the Fisher–Snedecor and Breusch–Pagan tests.
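As a hedged illustration of this workflow, the sketch below fits a linear model by ordinary least squares on synthetic data and computes the Breusch–Pagan LM statistic by regressing squared residuals on the regressors. The data and variable names are placeholders, not the chapter's Polish series.

```python
# OLS fit plus Breusch-Pagan test (LM = n * R^2 from regressing squared
# residuals on the regressors). All data are synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 200
rate = rng.uniform(1, 8, n)            # stand-in for a reference interest rate
inflation = rng.uniform(0, 10, n)
investment = 10 - 0.8 * rate + 0.3 * inflation + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), rate, inflation])
beta, *_ = np.linalg.lstsq(X, investment, rcond=None)
resid = investment - X @ beta

def r_squared(y, Z):
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    e = y - Z @ b
    tss = ((y - y.mean()) ** 2).sum()
    return 1 - (e**2).sum() / tss

# Under homoskedasticity, bp_lm is approximately chi-squared with 2 df
bp_lm = n * r_squared(resid**2, X)
print(beta, bp_lm)
```

A large `bp_lm` relative to the chi-squared critical value would signal heteroskedasticity and call the OLS standard errors into question.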

Findings

We find that Polish enterprises are responsive to changes in monetary policy: the level of corporate investment is correlated with the effects of monetary policy, especially decisions on changes to the central bank's base interest rate. We also find evidence that quantitative easing (QE) has a positive impact on Polish investment outlays. Corporate investment in Poland is positively affected by monetary policy through the Narodowy Bank Polski (NBP) reference rate, inflation, corporate loans, and the weighted average interest rate on corporate loans.

Details

Modeling Economic Growth in Contemporary Poland
Type: Book
ISBN: 978-1-83753-655-9

Book part
Publication date: 18 January 2024

Ackmez Mudhoo, Gaurav Sharma, Khim Hoong Chu and Mika Sillanpää

Abstract

Adsorption parameters (e.g. the Langmuir constant, mass transfer coefficient and Thomas rate constant) are involved in the design of aqueous-media adsorption treatment units. However, the classic approach to estimating such parameters is perceived to be imprecise. Herein, the essential features and performance of the ant colony, bee colony and elephant herd optimisation approaches are introduced to the experimental chemist and chemical engineer engaged in adsorption research for aqueous systems. Key research and development directions, believed to harness these algorithms for real-scale water treatment (which falls within the wide-ranging coverage of Sustainable Development Goal 6 (SDG 6), ‘Clean Water and Sanitation for All’), are also proposed. The ant colony, bee colony and elephant herd optimisations offer higher precision and accuracy than the classic estimation approach and are particularly efficient at finding the global optimum. It is hoped that the discussion can stimulate experimental chemists and chemical engineers to take stock of the progress achieved so far and to collaborate on strategies for integrating these intelligent optimisations into the design and operation of real multicomponent, multi-complexity adsorption systems for water purification.
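To give a flavour of how such algorithms estimate adsorption parameters, the sketch below uses a simple population-based random search, in the spirit of bee-colony methods but not a faithful implementation of any one algorithm, to recover Langmuir isotherm parameters q = qmax·K·C / (1 + K·C) from synthetic equilibrium data. All values are illustrative.

```python
# Toy metaheuristic fit of a Langmuir isotherm to synthetic data.
import random

random.seed(1)
QMAX_TRUE, K_TRUE = 50.0, 0.4
C = [0.5, 1, 2, 5, 10, 20, 50]                       # equilibrium concentrations
q_obs = [QMAX_TRUE * K_TRUE * c / (1 + K_TRUE * c) for c in C]

def sse(params):
    """Sum of squared errors between model and observed uptake."""
    qmax, k = params
    return sum((qmax * k * c / (1 + k * c) - q) ** 2 for c, q in zip(C, q_obs))

# Population of candidate solutions ("bees") exploring the search space
pop = [(random.uniform(1, 100), random.uniform(0.01, 2)) for _ in range(30)]
for _ in range(300):
    pop.sort(key=sse)
    best = pop[:10]                                   # keep the best food sources
    pop = best + [(b[0] + random.gauss(0, 1),
                   max(1e-6, b[1] + random.gauss(0, 0.05)))
                  for b in best for _ in range(2)]    # local exploration

qmax_hat, k_hat = min(pop, key=sse)
print(qmax_hat, k_hat)
```

Real ant colony, bee colony and elephant herd algorithms add structured exploration and information sharing on top of this basic select-and-perturb loop.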

Details

Artificial Intelligence, Engineering Systems and Sustainable Development
Type: Book
ISBN: 978-1-83753-540-8

Book part
Publication date: 27 June 2023

Richa Srivastava and M A Sanjeev

Abstract

Several inferential procedures are advocated in the literature, the most commonly used being the frequentist and the Bayesian. Bayesian methods afford inferences based on small data sets and are especially useful in studies with limited data availability. Bayesian approaches also help incorporate prior knowledge, especially subjective knowledge, into predictions. Given the increasing difficulty of data acquisition, Bayesian techniques can be hugely beneficial to managers, especially in limited-data situations such as studies of expert opinion. Another factor that has constrained the broader application of Bayesian statistics in business is the computational power required and the availability of appropriate analytical tools; with the increase in computational power and connectivity and the development of appropriate software, Bayesian applications have become more attractive. This chapter attempts to unravel the applications of the Bayesian inferential procedure in marketing management.
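A minimal, self-contained example of the small-data inference described above is a conjugate Beta-Binomial update, where a subjective prior is combined with a handful of observations. All numbers are illustrative.

```python
# Conjugate Beta-Binomial update: prior belief plus a small sample.

# Subjective prior: conversion rate around 30%, encoded as Beta(3, 7)
a_prior, b_prior = 3, 7

# Small observed sample: 4 conversions in 10 trials
successes, trials = 4, 10

# Conjugate update: posterior is Beta(a + successes, b + failures)
a_post = a_prior + successes
b_post = b_prior + (trials - successes)

posterior_mean = a_post / (a_post + b_post)
print(posterior_mean)  # 0.35: between the prior mean 0.30 and the sample rate 0.40
```

The posterior mean sits between the prior mean and the sample proportion, which is exactly the prior-plus-data compromise the chapter recommends for limited-data managerial problems.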

Book part
Publication date: 20 March 2024

Reetika Dadheech and Dhiraj Sharma

Abstract

Purpose: Preserving a country’s culture is crucial for its sustainability. Handicraft is a key draw for tourism destinations: it protects a civilisation’s indigenous knowledge and culture by sustaining historical, economic, and ecological ecosystems, and it aligns closely with sustainable development. The sector plays a significant role in creating employment, especially in rural regions, and is an essential contributor to the export economy, mainly in developing nations. The study focuses on the skills required and the existing gaps in the handicraft industry, and on its development and prospects, considering women and their role in preserving and embodying the traditional art of making handicrafts.

Approach: A framework has been developed for mapping and analysing the skills required in the handicraft sector using econometric modelling; a large number of skills were crowdsourced from respondents and analysed using machine learning techniques.

Findings: The study reveals that employment in this sector depends not only on general or specialised skills but also on a complex matrix of skills, ranging from punctuality to working in unclean and unsafe environments, together with personal qualities, such as taking initiative, and task-specific skills, such as polishing and colour coding.

Implications: The skills-mapping technique utilised in this study is applicable globally, particularly for women engaged in casual work in the handicraft industries of developing nations. The sustainable development goals, tourism, and handicrafts are all interconnected. The research contributes an understanding of skills mapping, which provides insights into efficient job matching by incorporating preferences and by studying the demand side of women’s casual work in the handicraft sector from a skills perspective.

Details

Contemporary Challenges in Social Science Management: Skills Gaps and Shortages in the Labour Market
Type: Book
ISBN: 978-1-83753-165-3
