Search results
1 – 10 of over 1000

Cleomar Gomes da Silva and Fábio Augusto Reis Gomes
Abstract
Purpose
The purpose of this paper is to contribute to the teaching of undergraduate macroeconomics.
Design/methodology/approach
The paper suggests a roadmap, based on a consumption function, for instructors who wish to teach the Lucas Critique.
Findings
This paper proposes a three-part lesson to help undergraduates better understand the subject: (1) a grading exercise that brings the topic closer to students’ lives; (2) a Keynesian and an optimal consumption function, followed by an example based on an unemployment insurance policy; and (3) two optional topics, consisting of extensions of the optimal consumption function and some empirical results related to the Lucas Critique.
Originality/value
The Lucas Critique influenced the evolution of research in macroeconomics, but it is not easily grasped in a classroom.
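The regime-dependence at the heart of the proposed lesson can be sketched numerically. The example below is illustrative and not taken from the paper: a reduced-form Keynesian consumption rule fitted under one unemployment-insurance regime mispredicts once the replacement rate changes, while a forward-looking rule that conditions on the policy parameters adjusts. All numbers (income, unemployment probability, replacement rates) are hypothetical.

```python
# Stylized Lucas Critique sketch (illustrative; not from the paper).
# Under a UI policy with replacement rate rho and unemployment
# probability p, expected income is (1 - p)*y + p*rho*y. A
# forward-looking consumer sets C equal to expected income, so the
# mapping from earned income to consumption depends on the policy.

def optimal_c(y, p, rho):
    """Consumption of a consumer who understands the UI policy."""
    return (1 - p) * y + p * rho * y

y = 100.0        # earned income (hypothetical)
p = 0.1          # unemployment probability (hypothetical)
old_rho = 0.2    # old replacement rate

# "Estimate" a reduced-form Keynesian propensity under the old regime:
mpc_old = optimal_c(y, p, old_rho) / y   # fitted slope of C on y

# Policy reform: more generous insurance.
new_rho = 0.8
keynesian_prediction = mpc_old * y       # reduced form ignores the reform
true_c = optimal_c(y, p, new_rho)        # structural rule re-solves

print(f"reduced-form prediction: {keynesian_prediction:.1f}")  # 92.0
print(f"actual consumption:      {true_c:.1f}")                # 98.0
```

The fitted coefficient is not policy-invariant: the reduced form underpredicts consumption after the reform, which is exactly the failure the Critique identifies.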
Abstract
Purpose
This paper aims to study the design of bank capital regulation and points out a conceptual downside of risk-sensitive regulation. The author argues that when a bank is better informed about its risk than the regulator, designing regulation is subject to the Lucas critique. The second-best regulation could be risk-insensitive, which provides an explanation for the leverage ratio as a backstop to risk-based capital requirements. This paper offers empirical predictions and implications for policy.
Design/methodology/approach
The argument in the paper is based on analytical results from mechanism design.
Findings
Optimal bank regulation could be risk-insensitive, as is observed in practice in the form of the leverage ratio rule.
Originality/value
Counter to conventional wisdom, the paper argues that bank regulation should not be sensitive to a bank's risk and provides a new explanation for why. It then offers empirical predictions and implications for policy.
Abstract
Aims to explore the possibility of developing a neoclassical theory of institutional failure, based on “transaction costs”. Critically assesses the role of institutions in General Equilibrium theory and concludes that, with the exception of the market (price mechanism), this theory is institution-free. This is unsatisfactory, given the importance of the firm and the state in particular, which have received wide attention recently in the theory of transaction costs. It is claimed that General Equilibrium theory can be given microfoundations based on transaction costs. This provides the possibility of a neoclassical theory of institutional failure. It also has important implications for the nature and scope of economic theory in general and the plan-versus-markets debate in particular.
Abstract
Purpose
The purpose of this paper is to describe the transformation of macro‐modelling from reduced form behavioural equations estimated separately, through to contemporary microfounded dynamic stochastic general equilibrium (DSGE) models estimated by systems methods. It is argued that estimated DSGE models should be seen as probability models that can be used as a laboratory for assessing new policies in a new and uncertain environment. The methodology is particularly relevant for emerging economies such as India.
Design/methodology/approach
This paper has analytical, empirical and policy dimensions. Estimating DSGE models by Bayesian‐Maximum‐Likelihood methods results in a posterior distribution of parameters that quantifies the uncertainty facing the policymaker. This, in turn, can be used for robust policy design.
Findings
The paper reviews evidence that inflation targeting in emerging economies welfare‐dominates exchange rate targeting.
Originality/value
The value lies in the papers reviewed, including those involving the author.
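The estimation step described under Design/methodology/approach can be illustrated with a toy example. The sketch below is hypothetical and far simpler than a DSGE system: it draws the persistence parameter of an AR(1) process by random-walk Metropolis under a flat prior. The point is that the output is a posterior distribution, whose spread quantifies the uncertainty facing the policymaker, rather than a single point estimate.

```python
import math
import random

random.seed(0)

# Hypothetical stand-in "model": an AR(1), y_t = phi*y_{t-1} + eps_t,
# eps ~ N(0,1). The Bayesian logic mirrors DSGE estimation in spirit.
true_phi = 0.7
y = [0.0]
for _ in range(400):
    y.append(true_phi * y[-1] + random.gauss(0, 1))

def log_post(phi):
    """Log posterior: flat prior on (-1, 1) plus Gaussian AR(1) likelihood."""
    if not -1 < phi < 1:
        return -math.inf
    return -0.5 * sum((y[t] - phi * y[t - 1]) ** 2 for t in range(1, len(y)))

# Random-walk Metropolis: accept a proposal with probability
# min(1, posterior ratio).
phi, draws = 0.0, []
for _ in range(5000):
    prop = phi + random.gauss(0, 0.05)
    delta = log_post(prop) - log_post(phi)
    if delta >= 0 or random.random() < math.exp(delta):
        phi = prop
    draws.append(phi)

post = draws[1000:]                       # drop burn-in
mean = sum(post) / len(post)
sd = math.sqrt(sum((d - mean) ** 2 for d in post) / len(post))
print(f"posterior mean ~ {mean:.2f}, posterior sd ~ {sd:.2f}")
```

The posterior standard deviation is the quantity a robust policy design would guard against; in a full DSGE exercise the same machinery runs over the whole parameter vector.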
Abstract
The purpose of this chapter is to study the mathematisation of finance – the excessive use of mathematical models in finance – which has been widely blamed for the recent financial and economic crisis. We argue that the problem might actually be the financialisation of mathematics, as evidenced by the gradual embedding of branches of mathematics into financial economics. The concept of embeddedness, originally proposed by Polanyi, is relevant to describing the sociological relationship between fields of knowledge. After exploring the relationship between mathematics, finance and economics since antiquity, we find that theoretical developments in the 1950s and 1970s led directly to this embedding. The key implication of our findings is that it has become necessary to disembed mathematics from finance and economics, and the chapter proposes a number of partial steps to facilitate this process. This chapter contributes to the debate on the mathematisation of finance by uniquely combining a historical approach, which chronicles the evolution of the relation between mathematics and finance, with a sociological approach based on Polanyi’s concept of embedding.
Giuseppe Pernagallo and Benedetto Torrisi
Abstract
Purpose
In the era of big data, investors deal every day with a huge flow of information. Given a model populated by economic agents with limited computational capacity, the paper shows how “too much” information could cause financial markets to depart from the assumption of informational efficiency. The purpose of the paper is to show that, as information increases, at some point the efficient market hypothesis ceases to be true. In general, the hypothesis cannot be maintained if using the maximum amount of information is not optimal for investors.
Design/methodology/approach
The authors use a model of cognitive heterogeneity to show the inadequacy of the notion of market efficiency in the modern society of big data.
Findings
Theorem 1 proves that as information grows, agents' processing capacities do not, so at some point there will be an amount of information that no one can fully use. The introduction of computer-based processing techniques can restore efficiency; machines, however, are also bounded. This means that as the amount of information increases, even in the presence of non-human processing techniques, at some point it will no longer be possible to process further information.
Practical implications
This paper explains why investors very often prefer heuristics to complex strategies.
Originality/value
This is, to the authors’ knowledge, the first model that uses information overload to prove informational inefficiency. The paper links big data to informational efficiency, and Theorem 1 proves that the old notion of efficiency is not well-founded because it relies on the unlimited processing capacities of economic agents.
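The capacity argument behind Theorem 1 can be sketched with illustrative numbers. This is not the paper's formal model, and all capacities below are assumptions: once the number of available signals exceeds the largest processing capacity in the market, human or machine, some share of the information necessarily goes unused.

```python
# Illustrative sketch (not the paper's model): each agent can process
# at most `capacity` signals per period. When signals outnumber even
# the best-equipped agent's capacity, some information is necessarily
# left unprocessed -- the mechanism behind the inefficiency claim.

def unused_fraction(n_signals, capacities):
    """Share of signals that not even the best-equipped agent can process."""
    best = max(capacities)            # machines included: still bounded
    return max(0.0, 1.0 - best / n_signals)

capacities = [50, 200, 1_000]         # e.g. households, professionals,
                                      # computer-based tools (hypothetical)

for n in (100, 1_000, 100_000):
    print(f"{n:>7} signals -> {unused_fraction(n, capacities):.0%} unusable")
```

With 1,000 signals the best-equipped agent still processes everything; at 100,000 signals, 99% of the information is beyond every capacity in the market, so prices cannot impound it.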
Rosaria Rita Canale and Rajmund Mirdala
Abstract
This chapter is devoted to fiscal policy theory and to how its evolution influenced the policy principles implemented from the end of World War II to the present. It shows how the theoretical foundations evolved: from the Keynesian theory, according to which public expenditure was conceived as an instrument to sustain aggregate demand and achieve full employment, to the present theoretical framework in which, following the intertemporal approach, it has been downgraded to an external shock. The public debt issue is examined with the aim of explaining why sound public finance represents a primary policy objective in the Eurozone.