Search results

1 – 10 of over 1000
Open Access
Article
Publication date: 30 June 2022

Cleomar Gomes da Silva and Fábio Augusto Reis Gomes

The purpose of this paper is to contribute to the teaching of undergraduate macroeconomics.

Abstract

Purpose

The purpose of this paper is to contribute to the teaching of undergraduate macroeconomics.

Design/methodology/approach

The paper suggests a roadmap, based on a consumption function, to be used by instructors who wish to teach the Lucas Critique.

Findings

This paper proposes a lesson in three parts to help undergraduates better understand the subject: (1) a grading exercise to bring the topic closer to students’ lives; (2) a Keynesian and an optimal consumption function, followed by an example based on an unemployment insurance policy; and (3) two optional topics consisting of extensions of the optimal consumption function and some empirical results related to the Lucas Critique.
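A rough sketch of the contrast between the two consumption functions mentioned in point (2) (notation chosen here for illustration, not taken from the paper) is:

    % Keynesian consumption function: a (autonomous consumption) and
    % b (marginal propensity to consume) are treated as fixed, policy-invariant parameters.
    C_t = a + b\,Y_t

    % Optimal, forward-looking consumption function: consumption depends on expected
    % discounted lifetime income, with \theta determined by the interest rate and
    % preferences, so the response of C_t to Y_t embeds expectations about policy.
    C_t = \theta\,\mathbb{E}_t \sum_{j=0}^{\infty} \frac{Y_{t+j}}{(1+r)^{j}}

Under the Lucas Critique, a reform such as more generous unemployment insurance changes households’ expectations of future income, so a value of b estimated from pre-reform data would mispredict consumption after the reform, whereas the forward-looking formulation makes the dependence on policy explicit.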

Originality/value

The Lucas Critique influenced the evolution of research in macroeconomics, but it is not easily grasped in a classroom.

Details

EconomiA, vol. 23 no. 1
Type: Research Article
ISSN: 1517-7580

Abstract

Details

New Directions in Macromodelling
Type: Book
ISBN: 978-1-84950-830-8

Article
Publication date: 16 July 2018

Alexander Bleck

This paper aims to study the design of bank capital regulation and points out a conceptual downside of risk-sensitive regulation. The author argues that when a bank is better…

Abstract

Purpose

This paper aims to study the design of bank capital regulation and points out a conceptual downside of risk-sensitive regulation. The author argues that when a bank is better informed about its risk than the regulator, designing regulation is subject to the Lucas critique. The second-best regulation could be risk-insensitive, which provides an explanation for the leverage ratio as a backstop to risk-based capital requirements. This paper offers empirical predictions and implications for policy.

Design/methodology/approach

The argument in the paper is based on analytical results from mechanism design.

Findings

Optimal bank regulation could be risk-insensitive, as is observed in practice in the form of the leverage ratio rule.

Originality/value

Counter to conventional wisdom, the paper argues that bank regulation should not be sensitive to the risk of the bank and provides a new explanation for why. It then offers empirical predictions and implications for policy.

Article
Publication date: 1 January 1992

Christos Pitelis

Aims to explore the possibility of developing a neoclassical theory of institutional failure, based on “transaction costs”. Critically assesses the role of institutions in General…

Abstract

Aims to explore the possibility of developing a neoclassical theory of institutional failure, based on “transaction costs”. Critically assesses the role of institutions in General Equilibrium theory and concludes that, with the exception of the market (price mechanism), the theory is institution‐free. This is unsatisfactory, given the importance of the firm and the state, in particular, which have received wide attention recently in the theory of transaction costs. It is claimed that General Equilibrium theory can be given microfoundations based on transaction costs. This provides the possibility of a neoclassical theory of institutional failure. It also has important implications for the nature and scope of economic theory in general and the plan versus markets debate in particular.

Details

Journal of Economic Studies, vol. 19 no. 1
Type: Research Article
ISSN: 0144-3585

Content available
Book part
Publication date: 4 September 2023

Stephen E. Spear and Warren Young

Abstract

Details

Overlapping Generations: Methods, Models and Morphology
Type: Book
ISBN: 978-1-83753-052-6

Article
Publication date: 13 April 2012

Paul Levine

The purpose of this paper is to describe the transformation of macro‐modelling from reduced form behavioural equations estimated separately, through to contemporary microfounded…

Abstract

Purpose

The purpose of this paper is to describe the transformation of macro‐modelling from reduced form behavioural equations estimated separately, through to contemporary microfounded dynamic stochastic general equilibrium (DSGE) models estimated by systems methods. It is argued that estimated DSGE models should be seen as probability models that can be used as a laboratory for assessing new policies in a new and uncertain environment. The methodology is particularly relevant for emerging economies such as India.

Design/methodology/approach

This paper has analytical, empirical and policy dimensions. Estimating DSGE models by Bayesian‐Maximum‐Likelihood methods results in a posterior distribution of parameters that quantifies the uncertainty facing the policymaker. This, in turn, can be used for robust policy design.
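As a loose, self-contained sketch of the estimation step described above (a random-walk Metropolis sampler for a toy AR(1) parameter rather than an actual DSGE model; all names and numbers are chosen here for illustration), the posterior draws produced are the kind of object a policymaker could use to quantify parameter uncertainty:

    # Illustrative only: Bayesian estimation of the persistence parameter "rho"
    # of a toy AR(1) process y_t = rho * y_{t-1} + eps_t, eps_t ~ N(0, 1).
    # A DSGE model would replace this likelihood with the model's own
    # (e.g. evaluated via the Kalman filter), but the logic is the same.
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate data from the "true" model
    true_rho, T = 0.7, 200
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = true_rho * y[t - 1] + rng.normal()

    def log_posterior(rho):
        """Log prior (uniform on (-1, 1)) plus Gaussian AR(1) log likelihood."""
        if abs(rho) >= 1.0:
            return -np.inf
        resid = y[1:] - rho * y[:-1]
        return -0.5 * np.sum(resid ** 2)

    # Random-walk Metropolis sampler
    draws, rho_current = [], 0.0
    lp_current = log_posterior(rho_current)
    for _ in range(5000):
        rho_prop = rho_current + 0.05 * rng.normal()
        lp_prop = log_posterior(rho_prop)
        if np.log(rng.uniform()) < lp_prop - lp_current:
            rho_current, lp_current = rho_prop, lp_prop
        draws.append(rho_current)

    posterior = np.array(draws[1000:])  # drop burn-in
    print(f"posterior mean {posterior.mean():.3f}, "
          f"90% interval [{np.quantile(posterior, 0.05):.3f}, "
          f"{np.quantile(posterior, 0.95):.3f}]")

The resulting posterior distribution, rather than a single point estimate, is what can then feed into robust policy design.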

Findings

The paper reviews evidence that inflation targeting in emerging economies welfare‐dominates exchange rate targeting.

Originality/value

The originality lies in the papers reviewed, including those involving the author.

Book part
Publication date: 16 December 2016

Sébastien Lleo and Jessica Li

The purpose of this chapter is to study the mathematisation of finance – excessive use of mathematical models in finance – which has been widely blamed for the recent financial…

Abstract

The purpose of this chapter is to study the mathematisation of finance – excessive use of mathematical models in finance – which has been widely blamed for the recent financial and economic crisis. We argue that the problem might actually be the financialisation of mathematics, as evidenced by the gradual embedding of branches of mathematics into financial economics. The concept of embeddedness, originally proposed by Polanyi, is relevant to describe the sociological relationship between fields of knowledge. After exploring the relationship between mathematics, finance and economics since antiquity, we find that theoretical developments in the 1950s and 1970s led directly to this embedding. The key implication of our findings is that it has become necessary to disembed mathematics from finance and economics, and the chapter proposes a number of partial steps to facilitate this process. This chapter contributes to the debate on the mathematisation of finance by uniquely combining a historical approach, which chronicles the evolution of the relation between mathematics and finance, with a sociological approach based on Polanyi’s concept of embeddedness.

Details

Finance and Economy for Society: Integrating Sustainability
Type: Book
ISBN: 978-1-78635-509-6

Article
Publication date: 23 October 2020

Giuseppe Pernagallo and Benedetto Torrisi

In the era of big data, investors deal every day with a huge flow of information. Given a model populated by economic agents with limited computational capacity, the paper shows…

Abstract

Purpose

In the era of big data, investors deal every day with a huge flow of information. Given a model populated by economic agents with limited computational capacity, the paper shows how “too much” information could cause financial markets to depart from the assumption of informational efficiency. The purpose of the paper is to show that, as information increases, at some point the efficient market hypothesis ceases to be true. In general, the hypothesis cannot be maintained if the use of the maximum amount of information is not optimal for investors.

Design/methodology/approach

The authors use a model of cognitive heterogeneity to show the inadequacy of the notion of market efficiency in the modern society of big data.

Findings

Theorem 1 proves that as information grows, agents' processing capacities do not, so at some point there will be an amount of information that no one can fully use. The introduction of computer-based processing techniques can restore efficiency; however, machines are also bounded. This means that as the amount of information increases, even in the presence of non-human techniques, at some point it will no longer be possible to process further information.
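A minimal numerical sketch of that intuition (illustrative numbers only, not the paper's model): holding each agent's processing capacity fixed while the amount of available information grows, the share any single agent can process shrinks toward zero.

    # Illustrative only: fixed individual processing capacity versus a growing
    # amount of available information. As information grows, the fraction that
    # any single (bounded) agent can process goes to zero, which is the
    # intuition behind the theorem summarised above.
    capacity = 1_000  # signals an agent can process per period (assumed figure)

    for n_signals in [100, 1_000, 10_000, 100_000, 1_000_000]:
        processed_share = min(capacity, n_signals) / n_signals
        print(f"{n_signals:>9} signals available -> "
              f"{processed_share:.1%} processed by a single agent")

The same ceiling applies to machine-assisted processing; a larger but still finite capacity only postpones the point at which further information goes unused.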

Practical implications

This paper explains why investors very often prefer heuristics to complex strategies.

Originality/value

This is, to the authors’ knowledge, the first model that uses information overload to prove informational inefficiency. The paper links big data to informational efficiency, and Theorem 1 proves that the old notion of efficiency is not well-founded because it relies on the unlimited processing capacities of economic agents.

Details

Review of Behavioral Finance, vol. 14 no. 2
Type: Research Article
ISSN: 1940-5979

Book part
Publication date: 13 May 2019

Rosaria Rita Canale and Rajmund Mirdala

This chapter is devoted to fiscal policy theory and to how its evolution influenced the policy principles implemented from the end of World War II to the present. It shows how…

Abstract

This chapter is devoted to fiscal policy theory and to how its evolution influenced the policy principles implemented from the end of World War II to the present. It shows how the theoretical foundations evolved, from the Keynesian theory, according to which public expenditure was conceived as an instrument to sustain aggregate demand and achieve full employment, to the present theoretical framework in which, following the intertemporal approach, public expenditure has been downgraded to an external shock. The public debt issue is examined with the aim of explaining why sound public finance represents a primary policy objective in the Eurozone.

Details

Fiscal and Monetary Policy in the Eurozone: Theoretical Concepts and Empirical Evidence
Type: Book
ISBN: 978-1-78743-793-7

Abstract

Details

Optimal Growth Economics: An Investigation of the Contemporary Issues and the Prospect for Sustainable Growth
Type: Book
ISBN: 978-0-44450-860-7
