Search results
Enrique Martínez-García, Diego Vilán and Mark A. Wynne
Abstract
Open-economy models are central to the discussion of the trade-offs monetary policy faces in an increasingly globalized world (e.g., Martínez-García & Wynne, 2010), but bringing them to the data is not without its challenges. Controlling for misspecification bias, we trace the problem of uncertainty surrounding structural parameter estimation in the context of a fully specified New Open Economy Macro (NOEM) model partly to sample size. We suggest that standard macroeconomic time series covering less than forty years may not be informative enough for some parameters of interest to be recovered with precision. We also illustrate how uncertainty arises from weak structural identification, irrespective of sample size. This remains a concern for empirical research, and we recommend estimation with simulated observations before using actual data as a way of detecting structural parameters that are prone to weak identification. We also recommend careful evaluation and documentation of the implementation strategy (especially the selection of observables), as it can have significant effects on the strength of identification of key model parameters.
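As a rough illustration of the pre-estimation check recommended here, the sketch below repeatedly estimates a toy AR(1) model on simulated samples of about forty years of quarterly data and inspects the dispersion of the estimates. The model, parameter values, and sample size are stand-ins, not the authors' NOEM specification.

```python
# A toy pre-estimation check, not the authors' NOEM model: simulate data from
# a known AR(1) parameterization, re-estimate by maximum likelihood on many
# short samples, and inspect how dispersed the estimates are.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
TRUE_RHO, TRUE_SIGMA = 0.9, 1.0

def simulate(T, rho=TRUE_RHO, sigma=TRUE_SIGMA):
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = rho * y[t - 1] + sigma * rng.standard_normal()
    return y

def neg_loglik(params, y):
    rho, sigma = params
    resid = y[1:] - rho * y[:-1]
    n = len(resid)
    return 0.5 * n * np.log(2 * np.pi * sigma**2) + 0.5 * np.sum(resid**2) / sigma**2

# Forty years of quarterly data is T = 160 observations.
estimates = []
for _ in range(200):
    y = simulate(T=160)
    res = minimize(neg_loglik, x0=[0.5, 0.5], args=(y,),
                   bounds=[(-0.99, 0.99), (0.01, 10.0)])
    estimates.append(res.x)
estimates = np.array(estimates)
print("mean estimate (rho, sigma):", estimates.mean(axis=0))
print("std dev of estimates:", estimates.std(axis=0))
```

A large spread of estimates across replications at the relevant sample size flags a parameter whose recovery from actual data should be treated with caution.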
Abstract
I survey applications of Markov switching models to the asset pricing and portfolio choice literatures. In particular, I discuss the potential that Markov switching models have to fit financial time series and at the same time provide powerful tools to test hypotheses formulated in the light of financial theories, and to generate positive economic value, as measured by risk-adjusted performances, in dynamic asset allocation applications. The chapter also reviews the role of Markov switching dynamics in modern asset pricing models in which the no-arbitrage principle is used to characterize the properties of the fundamental pricing measure in the presence of regimes.
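As a hedged illustration of the class of models surveyed here, the sketch below fits a two-regime Markov switching model to simulated returns using statsmodels; the regime means and volatilities are invented, and real applications would use actual asset returns.

```python
# A sketch of a two-regime switching model on simulated returns; the regime
# parameters are invented for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
calm = rng.normal(0.05, 0.5, size=300)       # low-volatility regime
volatile = rng.normal(-0.10, 2.0, size=100)  # high-volatility regime
returns = np.concatenate([calm[:150], volatile[:50], calm[150:], volatile[50:]])

# Constant mean and variance both switch across the two regimes.
model = sm.tsa.MarkovRegression(returns, k_regimes=2,
                                trend='c', switching_variance=True)
fit = model.fit()
print(fit.summary())
print("expected regime durations:", fit.expected_durations)
```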
Abstract
This chapter analyzes the empirical relationship between price-setting/consumption behavior and the sources of persistence in inflation and output. First, a small-scale New-Keynesian model (NKM) is examined using method of moments and maximum likelihood estimators with US data from 1960 to 2007. Then a formal test is used to compare the fit of two competing specifications of the New-Keynesian Phillips Curve (NKPC) and the IS equation, that is, backward- and forward-looking behavior. The inclusion of a lagged term in the NKPC and the IS equation improves the fit of the model while offsetting the influence of inherited and extrinsic persistence; intrinsic persistence is shown to play a major role in approximating inflation and output dynamics for the Great Inflation period. However, the null hypothesis cannot be rejected at the 5% level for the Great Moderation period; that is, the NKM with purely forward-looking behavior and its hybrid variant are equivalent. Monte Carlo experiments investigate the validity of the chosen moment conditions and the finite-sample properties of the chosen estimation methods. Finally, the empirical performance of the formal test is discussed along the lines of the Akaike and Bayesian information criteria.
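For concreteness, here is a minimal sketch of moment-based estimation of a hybrid NKPC of the form pi_t = gamma_b pi_{t-1} + gamma_f E_t pi_{t+1} + kappa x_t + e_t, instrumenting expected inflation with lagged variables. The data are simulated and the instrument list is illustrative, not the chapter's specification.

```python
# One-step linear GMM (equivalent to 2SLS with weight (Z'Z)^-1) on a toy
# hybrid NKPC. Data and instruments are illustrative only.
import numpy as np

def gmm_linear_iv(y, X, Z):
    W = np.linalg.inv(Z.T @ Z)
    A = X.T @ Z @ W @ Z.T @ X
    b = X.T @ Z @ W @ Z.T @ y
    return np.linalg.solve(A, b)

rng = np.random.default_rng(2)
T = 200
x = rng.standard_normal(T)          # output gap (toy)
pi = np.zeros(T)                    # inflation (toy)
for t in range(1, T):
    pi[t] = 0.5 * pi[t - 1] + 0.1 * x[t] + 0.3 * rng.standard_normal()

# Regressors: pi_{t-1}, pi_{t+1} (proxy for expected inflation), x_t.
y = pi[2:-1]
X = np.column_stack([pi[1:-2], pi[3:], x[2:-1]])
# Instruments dated t-1 or earlier: constant, pi_{t-1}, pi_{t-2}, x_{t-1}.
Z = np.column_stack([np.ones_like(y), pi[1:-2], pi[:-3], x[1:-2]])
print("gamma_b, gamma_f, kappa:", gmm_linear_iv(y, X, Z))
```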
Stan Aungst, Russell R. Barton and David T. Wilson
Abstract
Quality Function Deployment (QFD) proposes to take into account the “voice of the customer,” through a list of customer needs, which are (qualitatively) mapped to technical requirements in House One. But customers do not perceive products in this space, nor do they make purchase decisions in this space. Marketing specialists use statistical models to map between a simpler space of customer perceptions and the long, detailed list of needs. For automobiles, for example, the main axes in perceptual space might be categories such as luxury, performance, sport, and utility. A product’s position on these few axes determines the detailed customer requirements consistent with that position, such as interior volume, gauges and accessories, seating type, fuel economy, door height, horsepower, interior noise level, seating capacity, paint colors, trim, and so forth. Statistical models such as factor analysis and principal components analysis are used to describe the mapping between these spaces, which we call House Zero.
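A minimal sketch of the House Zero idea, under stated assumptions: principal components compress many detailed-need ratings into a few perceptual axes. The survey data and dimensions below are invented for illustration.

```python
# PCA as a stand-in for the House Zero mapping: many detailed-need ratings
# are reduced to a few perceptual axes. Data and dimensions are invented.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
# Rows: surveyed customers; columns: ratings of 12 detailed needs
# (interior volume, fuel economy, horsepower, noise level, ...).
ratings = rng.normal(size=(500, 12))
pca = PCA(n_components=3)
scores = pca.fit_transform(ratings)   # each customer's position on 3 axes
print(pca.explained_variance_ratio_)  # share of variance per axis
```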
This paper focuses on House One. Two important steps of the product development process using House One are (1) setting technical targets and (2) identifying the inherent tradeoffs in a design, including a position of merit. Utility functions are used to determine feature preferences for a product. Conjoint analysis is used to capture product preference and potential market share. Linear interpolation and the slope-point formula are used to determine other points of customer needs, as sketched below. This research draws on the formal mapping concepts developed by Nam Suh and the qualitative maps of quality function deployment to present a unified information and mapping paradigm for concurrent product/process design. This approach, the virtual integrated design method, is tested on data from a business design problem.
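The interpolation step mentioned above can be illustrated with the point-slope form of a line; the (technical value, utility) pairs below are hypothetical.

```python
# Point-slope linear interpolation between two measured
# (technical value, customer utility) points; the numbers are hypothetical.
def interpolate(x, x0, y0, x1, y1):
    slope = (y1 - y0) / (x1 - x0)
    return y0 + slope * (x - x0)

# Utility 0.2 at 100 hp and 0.8 at 200 hp; estimate utility at 150 hp.
print(interpolate(150, 100, 0.2, 200, 0.8))  # -> 0.5
```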
Abstract
Throughout the 1990s, the supply of new condominiums in Tokyo increased significantly while prices persistently fell. This article investigates whether the market power of condominium developers is a factor in explaining this outcome and whether there is a relationship between production cost trends and the degree of market power the developers were able to exercise. To address these questions, we construct and structurally estimate a dynamic durable-goods oligopoly model of the condominium market – one incorporating time-variant costs and a secondary market – using a nested GMM procedure. We find that the data provide no evidence that firms in the primary market have substantial market power in this industry. Moreover, a counterfactual experiment provides evidence that inflationary and deflationary expectations about production cost trends have asymmetric effects on the market power of condominium producers: the increase in markup when cost inflation is anticipated is significantly larger than the decrease in markup when cost deflation of the same magnitude is anticipated.
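A deliberately simplified static analogue of the markup logic at issue: in a symmetric N-firm Cournot oligopoly facing demand with elasticity eta, the first-order condition gives a Lerner index of 1/(N·eta). The paper's dynamic durable-goods model is far richer; this only shows how market structure maps into markups.

```python
# Static symmetric Cournot analogue, not the paper's dynamic model:
# the first-order condition gives Lerner index (p - mc)/p = 1 / (N * eta).
def cournot_markup(n_firms, demand_elasticity):
    return 1.0 / (n_firms * demand_elasticity)

for n in (1, 5, 20):
    print(f"{n:2d} firms -> markup {cournot_markup(n, 2.0):.3f}")
```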