Search results

1 – 10 of over 15000
Book part
Publication date: 1 January 2008

Gary Koop

Abstract

Equilibrium job search models allow for labor markets with homogeneous workers and firms to yield nondegenerate wage densities. However, the resulting wage densities do not accord well with empirical regularities. Accordingly, many extensions to the basic equilibrium search model have been considered (e.g., heterogeneity in productivity, heterogeneity in the value of leisure, etc.). It is increasingly common to use nonparametric forms for these extensions and, hence, researchers can obtain a perfect fit (in a kernel smoothed sense) between theoretical and empirical wage densities. This makes it difficult to carry out model comparison of different model extensions. In this paper, we first develop Bayesian parametric and nonparametric methods which are comparable to the existing non-Bayesian literature. We then show how Bayesian methods can be used to compare various nonparametric equilibrium search models in a statistically rigorous sense.
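
The model-comparison machinery referred to here is the Bayes factor, i.e. the ratio of marginal likelihoods of competing specifications for the wage density. As a minimal, generic sketch of that computation — using two simple parametric candidates, a synthetic wage sample and priors that are all assumptions rather than anything from the chapter — one can integrate each model's likelihood against its prior over a one-dimensional parameter grid:

```python
import numpy as np
from scipy import stats

# Hypothetical stand-in wage sample; the chapter's actual data are not reproduced here.
rng = np.random.default_rng(1)
wages = rng.lognormal(mean=2.5, sigma=0.5, size=200)

def log_marginal(data, loglik, logprior, grid):
    """Marginal likelihood by brute-force integration over a single model parameter."""
    log_joint = np.array([loglik(data, t) + logprior(t) for t in grid])
    shift = log_joint.max()                       # avoid underflow before exponentiating
    step = grid[1] - grid[0]
    return np.log(np.sum(np.exp(log_joint - shift)) * step) + shift

# Model A: exponential wage density, Gamma(2, 1) prior on the rate parameter.
logml_a = log_marginal(
    wages,
    loglik=lambda w, r: stats.expon.logpdf(w, scale=1.0 / r).sum(),
    logprior=lambda r: stats.gamma.logpdf(r, a=2.0, scale=1.0),
    grid=np.linspace(1e-4, 2.0, 4000))

# Model B: lognormal wage density with unit log-scale, N(0, 10^2) prior on the log-mean.
logml_b = log_marginal(
    wages,
    loglik=lambda w, m: stats.lognorm.logpdf(w, s=1.0, scale=np.exp(m)).sum(),
    logprior=lambda m: stats.norm.logpdf(m, loc=0.0, scale=10.0),
    grid=np.linspace(-2.0, 8.0, 4000))

print("log Bayes factor (B vs A):", logml_b - logml_a)   # > 0 favours the lognormal model
```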

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Book part
Publication date: 2 December 2003

Kenji Wada

Abstract

I investigate the short-term and long-term characteristics of the Japanese daily overnight call rate between 1985 and 1999 and compare them with those of the U.S. federal funds rate over the same period. Long-term data of this kind for the call rate have not been utilized in previous studies. For each rate, the short-term characteristics are found to differ from the corresponding long-term ones. Comparing the two rates with each other, the long-term characteristics are found to differ, even though the short-term characteristics are similar in some sub-sample periods.

Details

The Japanese Finance: Corporate Finance and Capital Markets in ...
Type: Book
ISBN: 978-1-84950-246-7

Book part
Publication date: 16 December 2009

Chinman Chui and Ximing Wu

Abstract

Knowledge of the dependence structure between financial assets is crucial for improving performance in financial risk management. It is known that the copula completely summarizes the dependence structure among multiple variables. We propose a multivariate exponential series estimator (ESE) to estimate copula densities nonparametrically. The ESE has an appealing information-theoretic interpretation and attains the optimal rate of convergence for nonparametric density estimation established in Stone (1982). More importantly, it overcomes the boundary bias of conventional nonparametric copula estimators. Our extensive Monte Carlo studies show that the proposed estimator outperforms the kernel and log-spline estimators in copula estimation. They also demonstrate that two-step density estimation through an ESE copula often outperforms direct estimation of joint densities. Finally, the ESE copula provides superior estimates of tail dependence compared with the empirical tail index coefficient. An empirical examination of the Asian financial markets using the proposed method is provided.
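
As a rough illustration of what an exponential series density estimator looks like on the copula scale, the sketch below fits c(u, v) ∝ exp(θ′φ(u, v)) with a small tensor-product Legendre basis by maximum likelihood, normalizing on a grid of the unit square. It is a bare-bones stand-in rather than the authors' ESE: the basis order, the grid normalization, the rank-based pseudo-observations and the omission of uniform-margin constraints are all simplifying assumptions.

```python
import numpy as np
from numpy.polynomial import legendre
from scipy.optimize import minimize
from scipy.stats import rankdata

def shifted_legendre(k, u):
    """Legendre polynomial P_k mapped from [-1, 1] to [0, 1]."""
    coef = np.zeros(k + 1)
    coef[k] = 1.0
    return legendre.legval(2.0 * np.asarray(u) - 1.0, coef)

def basis(u, v, order):
    """Tensor-product terms P_j(u) P_k(v) with 1 <= j + k <= order."""
    cols = [shifted_legendre(j, u) * shifted_legendre(k, v)
            for j in range(order + 1) for k in range(order + 1)
            if 1 <= j + k <= order]
    return np.column_stack(cols)

def fit_ese_copula(x, y, order=3, grid_size=60):
    # rank-based pseudo-observations on (0, 1)
    u = rankdata(x) / (len(x) + 1.0)
    v = rankdata(y) / (len(y) + 1.0)
    g = np.linspace(0.0, 1.0, grid_size)
    gu, gv = np.meshgrid(g, g)
    phi_grid = basis(gu.ravel(), gv.ravel(), order)
    phi_data = basis(u, v, order)

    def neg_loglik(theta):
        # log normalizing constant approximated by an average over the unit-square grid
        log_norm = np.log(np.mean(np.exp(phi_grid @ theta)))
        return -(phi_data @ theta - log_norm).mean()

    theta = minimize(neg_loglik, np.zeros(phi_data.shape[1]), method="BFGS").x
    log_norm = np.log(np.mean(np.exp(phi_grid @ theta)))
    return lambda uu, vv: np.exp(basis(uu, vv, order) @ theta - log_norm)

# toy usage on simulated dependent data
rng = np.random.default_rng(0)
z = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=500)
copula_density = fit_ese_copula(z[:, 0], z[:, 1])
print(copula_density(np.array([0.9]), np.array([0.9])))   # elevated density in the joint upper tail
```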

Details

Nonparametric Econometric Methods
Type: Book
ISBN: 978-1-84950-624-3

Book part
Publication date: 30 November 2011

Massimo Guidolin

Abstract

I review the burgeoning literature on applications of Markov regime switching models in empirical finance. Particular attention is devoted to the ability of Markov switching models to fit the data, to filter unknown regimes and states from the data, to provide a powerful tool for testing hypotheses formulated in light of financial theories, and to their forecasting performance for both point and density predictions. The review covers papers spanning a multiplicity of sub-fields in financial economics, from empirical analyses of stock returns and the term structure of default-free interest rates to the dynamics of exchange rates and the joint process of stock and bond returns.
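
For readers unfamiliar with the regime-filtering step mentioned here, the sketch below implements the standard Hamilton filter for a two-state Gaussian Markov-switching model. The transition matrix, means and volatilities are illustrative assumptions, not estimates from any of the surveyed papers.

```python
import numpy as np
from scipy.stats import norm

def hamilton_filter(y, P, mu, sigma):
    """Filtered regime probabilities and log-likelihood for a Markov-switching
    Gaussian model. P[i, j] = Pr(s_t = j | s_{t-1} = i)."""
    k = len(mu)
    # stationary distribution of the chain as the initial regime probabilities
    A = np.vstack([np.eye(k) - P.T, np.ones(k)])
    b = np.append(np.zeros(k), 1.0)
    xi = np.linalg.lstsq(A, b, rcond=None)[0]
    filtered, loglik = [], 0.0
    for yt in y:
        pred = P.T @ xi                              # one-step-ahead regime probabilities
        dens = norm.pdf(yt, loc=mu, scale=sigma)     # regime-conditional densities
        joint = pred * dens
        lik = joint.sum()
        loglik += np.log(lik)
        xi = joint / lik                             # Bayes update: filtered probabilities
        filtered.append(xi)
    return np.array(filtered), loglik

# toy usage: a calm regime and a volatile regime for (assumed) daily returns
rng = np.random.default_rng(0)
P = np.array([[0.98, 0.02], [0.05, 0.95]])
mu, sigma = np.array([0.05, -0.10]), np.array([0.7, 1.8])
states = [0]
for _ in range(999):
    states.append(rng.choice(2, p=P[states[-1]]))
returns = rng.normal(mu[states], sigma[states])
probs, ll = hamilton_filter(returns, P, mu, sigma)
print("log-likelihood:", round(ll, 2), " filtered time in volatile regime:", round(probs[:, 1].mean(), 2))
```

Maximum-likelihood estimation of the parameters would simply wrap this filter in a numerical optimizer; smoothed probabilities and forecasts follow from the same recursions.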

Details

Missing Data Methods: Time-Series Methods and Applications
Type: Book
ISBN: 978-1-78052-526-6

Article
Publication date: 20 November 2023

Reddy K. Prasanth Kumar, Nageswara Rao Boggarapu and S.V.S. Narayana Murty

Abstract

Purpose

This paper adopts a modified Taguchi approach to develop empirical relationships for the performance characteristics (output responses) in terms of the process variables and demonstrates their validity through comparison with test data. The method requires only a few tests, as per the orthogonal array, yet provides complete information for all combinations of levels of the process variables. It also provides the estimated range of the output responses, so that the scatter in repeated tests can be assessed prior to testing.

Design/methodology/approach

In order to obtain defect-free products meeting the required specifications, researchers have conducted extensive experiments using the powder bed fusion (PBF) process, measuring the performance indicators (namely, relative density, surface roughness and hardness) to specify a set of printing parameters (namely, laser power, scanning speed and hatch spacing). A simple and reliable multi-objective optimization method is considered in this paper for specifying a set of optimal process parameters with SS316 L powder. It has been reported that test samples printed even with the optimal set of input variables revealed irregularly shaped microscopic porosities and improper melt pool formation.

Findings

Finally, based on detailed analysis, it is concluded that the performance indicators cannot be expressed explicitly in terms of the equivalent energy density (E_0^*) alone, because multiple sets of selective laser melting (SLM) process parameters correspond to the same energy density yet yield different performance indicators. Empirical relations for the performance indicators are therefore developed directly in terms of the SLM process parameters. The test data fall within or close to the expected range.

Practical implications

Based on extensive analysis of the SS316 L data using the modified Taguchi approach, the optimized process parameters are laser power = 298 W, scanning speed = 900 mm/s and hatch distance = 0.075 mm, for which the results are surface roughness = 2.77 Ra, relative density = 99.24%, hardness = 334 Hv and equivalent energy density = 4.062. The corresponding estimated values are surface roughness = 3.733 Ra, relative density = 99.926%, hardness = 213.64 Hv and equivalent energy density = 3.677.

Originality/value

Even though the equivalent energy density represents the energy input to the process, this paper concludes that energy density should no longer be considered a dependent process parameter, as a single specified energy density can correspond to multiple different results. This aspect is demonstrated using test data.
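
To make the energy-density point concrete: a commonly used volumetric energy density for laser powder bed fusion is E = P / (v · h · t), i.e. laser power over the product of scanning speed, hatch spacing and layer thickness. The paper's equivalent energy density E_0^* is a related but differently normalized quantity, which is why its reported value of 4.062 does not match the E computed here; the layer thickness and the second parameter set below are assumptions. The toy calculation shows that different parameter sets can share one energy density, which is why a single E value cannot pin down the performance indicators:

```python
# Volumetric energy density E = P / (v * h * t) in J/mm^3 (layer thickness t assumed 0.03 mm)
def energy_density(power_w, speed_mm_s, hatch_mm, layer_mm=0.03):
    return power_w / (speed_mm_s * hatch_mm * layer_mm)

# Optimized parameter set reported in the paper vs an assumed alternative tuned to the same E
e1 = energy_density(298.0, 900.0, 0.075)     # ~147 J/mm^3
e2 = energy_density(198.67, 600.0, 0.075)    # same P/v ratio, hence essentially the same E
print(round(e1, 1), round(e2, 1))            # equal E, yet melt-pool behaviour generally differs
```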

Details

Multidiscipline Modeling in Materials and Structures, vol. 20 no. 1
Type: Research Article
ISSN: 1573-6105

Book part
Publication date: 13 May 2017

Hugo Jales and Zhengfei Yu

Abstract

This chapter reviews recent developments in the density discontinuity approach. It is well known that agents having perfect control of the forcing variable will invalidate the popular regression discontinuity designs (RDDs). To detect manipulation of the forcing variable, McCrary (2008) developed a test based on the discontinuity in the density around the threshold. Recent papers have noted that the sorting patterns around the threshold are often either the researcher's object of interest in their own right or related to structural parameters, such as tax elasticities, through known functions. This, in turn, implies that the behavior of the distribution around the threshold is not only informative about the validity of a standard RDD; it can also be used to recover policy-relevant parameters and perform counterfactual exercises.
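
As a rough sketch of the mechanics behind McCrary's (2008) test — not his exact estimator, bandwidth selector or inference procedure, which the chapter discusses — one can finely bin the forcing variable, smooth the histogram with separate local linear fits on each side of the threshold, and compare the two boundary density estimates on the log scale. The bin width, bandwidth and toy data below are assumptions.

```python
import numpy as np

def log_density_jump(x, cutoff, bandwidth, bin_width):
    """Bin the forcing variable, local-linearly smooth the histogram on each side
    of the cutoff and return the log difference of the two boundary estimates."""
    edges = np.arange(x.min(), x.max() + bin_width, bin_width)
    counts, edges = np.histogram(x, bins=edges)
    mids = 0.5 * (edges[:-1] + edges[1:])
    heights = counts / (len(x) * bin_width)                 # normalized histogram heights

    def boundary_estimate(side):
        mask = mids < cutoff if side == "left" else mids >= cutoff
        m, h = mids[mask], heights[mask]
        w = np.clip(1.0 - np.abs(m - cutoff) / bandwidth, 0.0, None)   # triangular kernel
        X = np.column_stack([np.ones_like(m), m - cutoff])
        XtW = X.T * w
        beta = np.linalg.solve(XtW @ X, XtW @ h)
        return beta[0]                                      # estimated density at the cutoff

    return np.log(boundary_estimate("right")) - np.log(boundary_estimate("left"))

# toy usage: manipulation piles mass just above the threshold at 0
rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, 5000)
bunched = np.where((clean > -0.2) & (clean < 0.0), np.abs(clean), clean)   # push mass across 0
print(round(log_density_jump(clean, 0.0, 0.5, 0.05), 2),
      round(log_density_jump(bunched, 0.0, 0.5, 0.05), 2))  # near 0 vs clearly positive
```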

Details

Regression Discontinuity Designs
Type: Book
ISBN: 978-1-78714-390-6

Article
Publication date: 28 August 2020

Qingying Wang, Rongjun Cheng and Hongxia Ge

Abstract

Purpose

The purpose of this paper is to explore how curved road and lane-changing rates affect the stability of traffic flow.

Design/methodology/approach

An extended two-lane lattice hydrodynamic model on a curved road accounting for the empirical lane-changing rate is presented. The linear stability analysis of the new model is carried out, and the stability condition and the neutral stability condition are obtained. The mKdV equation and its solution are then derived through nonlinear analysis, which characterizes the behavior of the extended model in the unstable region. Finally, the results of the theoretical analysis are verified by numerical simulation.
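
The sketch below does not implement the authors' extended two-lane curved-road lattice model; it only illustrates the same style of linear-stability check on the closely related single-lane optimal-velocity car-following model, whose uniform flow is linearly stable when the sensitivity a exceeds 2V′(h*). All functional forms and parameter values are illustrative assumptions.

```python
import numpy as np

def optimal_velocity(headway, v_max=2.0, h_c=4.0):
    """Standard tanh-shaped optimal velocity function (illustrative parameters)."""
    return 0.5 * v_max * (np.tanh(headway - h_c) + np.tanh(h_c))

def headway_spread(a, n_cars=100, road_length=400.0, dt=0.05, steps=10000):
    """Perturb the uniform state on a ring road and report the final headway spread;
    a large spread signals that stop-and-go waves have developed."""
    h_star = road_length / n_cars
    x = np.arange(n_cars) * h_star
    x[0] += 0.5                                    # small localized perturbation
    v = np.full(n_cars, optimal_velocity(h_star))
    for _ in range(steps):
        headway = np.roll(x, -1) - x
        headway[-1] += road_length                 # periodic boundary (ring road)
        v += a * (optimal_velocity(headway) - v) * dt
        x += v * dt
    headway = np.roll(x, -1) - x
    headway[-1] += road_length
    return headway.max() - headway.min()

h_star = 4.0
dV = (optimal_velocity(h_star + 1e-6) - optimal_velocity(h_star - 1e-6)) / 2e-6
print("stability threshold 2*V'(h*):", round(2 * dV, 2))
print("headway spread with a = 2.5 (stable):  ", round(headway_spread(2.5), 3))
print("headway spread with a = 1.0 (unstable):", round(headway_spread(1.0), 3))
```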

Findings

The empirical lane-changing rate on a curved road is an important factor that can alleviate traffic congestion.

Research limitations/implications

This paper does not take into account factors such as road slope and driver characteristics, which also influence the stability of traffic flow to a greater or lesser extent, so a certain gap remains between the model and the real traffic environment.

Originality/value

The curved road and the empirical lane-changing rate are studied simultaneously in a two-lane lattice hydrodynamic model in this paper. The improved model better reflects actual traffic and can provide a theoretical reference for real-world traffic management.

Details

Engineering Computations, vol. 38 no. 4
Type: Research Article
ISSN: 0264-4401

Book part
Publication date: 5 April 2024

Alecos Papadopoulos

Abstract

The author develops a bilateral Nash bargaining model under value uncertainty and private/asymmetric information, combining ideas from axiomatic and strategic bargaining theory. The solution to the model leads organically to a two-tier stochastic frontier (2TSF) setup with intra-error dependence. The author presents two different statistical specifications to estimate the model: one accounts for regressor endogeneity using copulas, while the other can separately identify the bargaining-power and private-information effects at the individual level. An empirical application using a matched employer–employee data set (MEEDS) from Zambia and a second application using one from Ghana showcase the applied potential of the approach.
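
The abstract does not detail the estimators, so the sketch below only simulates the generic 2TSF composed-error structure y = β0 + β1·x + v + w − u, with two one-sided components pulling the outcome in opposite directions. It illustrates why the setup needs distributional (or, as here, copula- or information-based) assumptions: OLS still recovers the slope, but the one-sided components are absorbed into the intercept as E[w] − E[u]. All distributions and parameter values are assumptions, not the author's specification.

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta0, beta1 = 5000, 1.0, 0.8
lam_w, lam_u = 2.0, 4.0                 # exponential rates of the two one-sided components

x = rng.normal(0.0, 1.0, n)
v = rng.normal(0.0, 0.3, n)             # symmetric two-sided noise
w = rng.exponential(1.0 / lam_w, n)     # e.g. surplus extracted by one bargaining side
u = rng.exponential(1.0 / lam_u, n)     # e.g. surplus conceded to the other side
y = beta0 + beta1 * x + v + w - u       # 2TSF composed-error outcome equation

X = np.column_stack([np.ones(n), x])
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print("OLS slope    :", round(b_ols[1], 3), "(true 0.8)")
print("OLS intercept:", round(b_ols[0], 3),
      "vs beta0 + E[w] - E[u] =", round(beta0 + 1 / lam_w - 1 / lam_u, 3))
```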

Article
Publication date: 14 September 2010

Kirsten Martinus

Abstract

Purpose

The purpose of this paper is to provide conceptual foundations for a study exploring the capacity of hard infrastructure and amenities to influence the socio‐economic imprint of urban spaces. The paper argues that some urban developments are more economically efficient in generating innovation and knowledge than others.

Design/methodology/approach

The paper reviews the debate over urban density and infrastructure. Drawing on empirical evidence and economic production theory, it explores the spatial links between economic growth, innovation and knowledge productivity. It argues that the growing role of human capital in the production process has linked productivity to a city's mix and levels of infrastructure and amenities. It reviews five key infrastructure types for knowledge‐based developments.

Findings

This paper finds that the positive contribution of density to urban vibrancy and human connectivity is constrained by a city's infrastructure and amenity levels. It concludes that urban development cognisant of an appropriate mix and level of infrastructure and amenities will more likely enhance regional knowledge development and innovation than those which are not.

Social implications

The evidence presented in this paper has a broad range of strategic and practical socio‐economic implications, and contributes towards understanding how urban form can leverage social aspects of a city for economic growth.

Originality/value

Using an interdisciplinary approach, this paper provides invaluable insights into the types of infrastructure and the importance of urban form for knowledge‐based development. It contends that well‐planned knowledge‐based developments can be leveraged to ensure the successful implementation and delivery of national innovation and productivity priorities.

Details

Journal of Knowledge Management, vol. 14 no. 5
Type: Research Article
ISSN: 1367-3270

Book part
Publication date: 21 November 2014

Jan F. Kiviet and Jerzy Niemczyk

Abstract

IV estimation is examined when some instruments may be invalid. This is relevant because the initial just-identifying orthogonality conditions are untestable, whereas their validity is required when testing the orthogonality of additional instruments by so-called overidentification restriction tests. Moreover, these tests have limited power when samples are small, especially when instruments are weak. Distinguishing between conditional and unconditional settings, we analyze the limiting distribution of inconsistent IV estimators and examine normal first-order asymptotic approximations to their density in finite samples. For simple classes of models we compare these approximations with their simulated empirical counterparts over almost the full parameter space, which is characterized by measures of model fit, simultaneity, instrument invalidity, and instrument weakness. Our major findings are that, for the accuracy of large-sample asymptotic approximations, instrument weakness is much more detrimental than instrument invalidity. Also, IV estimators obtained from strong but possibly invalid instruments are usually much closer to the true parameter values than those obtained from valid but weak instruments.
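
The last finding is easy to illustrate with a small simulation. In the hypothetical design below (all coefficient values are assumptions, not the chapter's), z_s is a strong instrument with a small direct effect on the outcome (hence invalid), while z_w is valid but weak; the strong-but-invalid instrument typically delivers estimates much closer to the true coefficient.

```python
import numpy as np

rng = np.random.default_rng(0)
beta, n, reps = 1.0, 500, 2000

def iv(y, x, z):
    """Just-identified IV estimator beta_hat = (z'y) / (z'x)."""
    return (z @ y) / (z @ x)

err_strong_invalid, err_weak_valid = [], []
for _ in range(reps):
    u = rng.standard_normal(n)                    # structural error
    zs = rng.standard_normal(n)                   # strong but (mildly) invalid instrument
    zw = rng.standard_normal(n)                   # valid but weak instrument
    x = 1.0 * zs + 0.1 * zw + 0.5 * u + rng.standard_normal(n)   # endogenous regressor
    y = beta * x + 0.1 * zs + u                   # zs also enters y directly -> invalid
    err_strong_invalid.append(abs(iv(y, x, zs) - beta))
    err_weak_valid.append(abs(iv(y, x, zw) - beta))

print("median |error|, strong but invalid:", round(np.median(err_strong_invalid), 3))
print("median |error|, weak but valid:    ", round(np.median(err_weak_valid), 3))
```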
