The purpose of this paper is to present a real estate mass appraisal algorithm in which the impact of attributes (real estate features) is estimated with an inequality restricted least squares (IRLS) model.
This paper presents a real estate mass appraisal algorithm, which is also expressed as an econometric model. A vital problem with econometric models of mass appraisal is multicollinearity. In this paper, a priori knowledge about the parameters is incorporated by imposing restrictions in the form of inequalities; an IRLS model is therefore used to limit the negative consequences of multicollinearity. In ordinary least squares (OLS) models, multicollinearity can inflate estimator variances, which may lead to wrong signs of estimates. IRLS estimators are more efficient (their variances are lower), which can result in better appraisals.
The final effect of the analysis is a vector of the impact of real estate attributes on their value in the mass appraisal algorithm. After making expert corrections, the algorithm was used to evaluate 318 properties from the test set. Valuation errors were also discussed.
Restrictions in the form of inequalities were imposed on the parameters of the econometric model, ensuring the non-negativity and monotonicity of real estate attribute impacts. In the case of real estate, explanatory variables are usually correlated, so OLS estimator variances are inflated and the estimators are inefficient. Imposing restrictions in the form of inequalities can improve the results because IRLS estimators are more efficient. When results are inconsistent with theoretical assumptions, the real estate mass appraisal algorithm allows an expert to adjust them. This can be important for low-quality databases, which are common in underdeveloped real estate markets. Another reason for expert correction may be the low efficiency of a given real estate market.
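The non-negativity restriction described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: it uses synthetic data with two near-collinear attributes and solves the inequality-restricted problem with `scipy.optimize.lsq_linear` (one of several solvers that handle bound constraints), comparing it with unrestricted OLS.

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(0)
n = 200
# Two highly correlated attributes (e.g. floor area and room count).
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)      # near-collinear with x1
X = np.column_stack([np.ones(n), x1, x2])
y = X @ np.array([100.0, 5.0, 0.5]) + rng.normal(scale=10.0, size=n)

# Unrestricted OLS: collinearity inflates variances; signs may flip.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# IRLS-style fit: impose non-negativity on the attribute impacts
# (intercept left unrestricted).
res = lsq_linear(X, y, bounds=([-np.inf, 0.0, 0.0], np.inf))
beta_irls = res.x
```

The restricted solution is guaranteed to respect the sign assumptions (a positive attribute cannot lower the appraised value), which is the property the abstract attributes to the IRLS approach.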
After briefly reviewing the history of Bayesian econometrics and Alan Greenspan's (2004) description of his use of Bayesian methods in managing policy-making risk, some of the issues and needs that he mentions are discussed and linked to past and present Bayesian econometric research. Then a review of some recent Bayesian econometric research and needs is presented. Finally, some thoughts are offered on the future of Bayesian econometrics.
This article seeks to (1) identify forecasting techniques used to estimate taxable sales in California counties; (2) analyze which of these produces the most accurate estimate; (3) document what prevented California county officials from using the most accurate forecasting technique; and (4) determine what forecasting approach would work best for individual counties. This research generally confirms previous findings that judgmental approaches are the most commonly used method of revenue forecasting in smaller localities. In terms of accuracy, econometric models outperform other quantitative methods, particularly trend line fitting and extrapolation-by-average approaches. The "not now but later" perception of econometric models can be ascribed to California county forecasters' discomfort with, and lack of preparation for, this sophisticated technique. Once the critical prerequisites for the use of econometric models are provided, such as statewide training, timely inter-governmental data sharing, easy access to economic data, and user-friendly forecasting formats with automated procedures, econometric models can serve the needs of California counties.
In antitrust class-action litigation, courts are increasingly unlikely to accept the presumption that all class members were harmed by price-fixing among a group of firms or by exclusionary behavior by a single firm. Econometric methods typically applied in antitrust and other settings estimate the average effect of the challenged conduct, but do not inform impact for individual class members. We present classwide econometric methods and statistical tests for detecting the existence (or lack thereof) of common impact and determining what proportion (if any) of the proposed class suffered injury in many class actions. We conclude that econometric tools can meaningfully inform the legal process, even when courts demand proof of common impact.
This paper examines the diffusion of Jerry Hausman's econometric ideas using citation counts, citing authors, and the source journals of his most referenced citers. Bibliographic information and citation counts of references to econometrics papers were retrieved from Thomson Reuters Web of Science and analyzed to determine the various ways in which Hausman's ideas have spread in econometrics and related disciplines. Econometric growth analysis (Gompertz and logistic functions) is used to measure the diffusion of his contributions. This analysis reveals that the diffusion of Hausman's ideas has been pervasive over time and across disciplines. For example, his seminal 1978 paper continues to be strongly cited, with citations growing exponentially, mainly in econometrics but also in other fields such as administrative management, human resources, and psychology. Some of his more recent papers show a growth pattern resembling that of the 1978 paper, which leads us to conclude that Hausman's econometric contributions will continue to diffuse in years to come. It was also found that five journals have published the bulk of the top cited papers that list Hausman as a reference, namely Econometrica, Journal of Econometrics, Review of Economic Studies, Academy of Management Journal, and the Journal of Economic Literature. "Specification tests in econometrics" is Hausman's dominant contribution in this citation analysis. We found no previous research on the econometric modeling of citation counts as done in this paper. Thus, we expect to stimulate methodological improvements in future work.
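The logistic growth analysis mentioned above can be sketched as a curve fit of cumulative citation counts against time. The data below is synthetic (the paper's actual citation series is not reproduced here), and `scipy.optimize.curve_fit` stands in for whatever estimation routine the authors used:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic diffusion curve: K = saturation level,
    r = growth rate, t0 = inflection year (years since start)."""
    return K / (1.0 + np.exp(-r * (t - t0)))

# Synthetic cumulative citation counts, 1978 onward (illustrative only).
years = np.arange(1978, 2013)
t = (years - years[0]).astype(float)
rng = np.random.default_rng(1)
cites = logistic(t, 5000.0, 0.25, 17.0) + rng.normal(scale=50.0, size=t.size)

# Fit the three logistic parameters to the observed series.
popt, _ = curve_fit(logistic, t, cites, p0=[cites[-1], 0.1, t.size / 2])
K_hat, r_hat, t0_hat = popt
```

A Gompertz curve, `K * exp(-b * exp(-r * t))`, could be fitted the same way; comparing the two fits is the usual way to choose between the diffusion models named in the abstract.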
This study uses neural network and econometric models to explore the relative importance of fiscal and monetary policy for GNP. The findings suggest that fiscal policy is more influential than monetary policy, and that the neural network forecasts of GNP are more accurate and have less variation than those of the econometric approach.
Fiscal stress has forced local governments to pay increasing attention to revenue trends and has increased the importance of financial forecasting in local government. After reviewing the role of revenue forecasting in financial planning and discussing the use of regression and econometric analysis in revenue forecasting, this article applies these techniques to forecast several key revenue components in a medium-sized city. Three general conclusions may be drawn: (1) systematic revenue forecasting and long-range planning are necessities, not luxuries; (2) risk aversion to "technical" revenue forecasting can be overcome; and (3) implementing a systematic revenue forecasting system does not require a battery of "rocket scientists." As municipal revenue bases come to rely less on relatively stable property taxes and more on less stable sources such as sales taxes, fees, and charges, the use of a regression- and econometrics-based model should prove increasingly fruitful.
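The regression-based revenue forecasting the article describes can be sketched in a few lines: regress a revenue component on an economic driver, then apply the fitted coefficients to a projected value of that driver. The figures below are illustrative assumptions, not data from the article's case city:

```python
import numpy as np

# Hypothetical ten-year history: city sales-tax revenue ($M) and
# county personal income ($B) -- illustrative numbers only.
income  = np.array([4.1, 4.3, 4.6, 4.8, 5.0, 5.3, 5.5, 5.9, 6.2, 6.4])
revenue = np.array([10.2, 10.8, 11.5, 11.9, 12.4, 13.1, 13.5, 14.4, 15.1, 15.6])

# Fit revenue = b0 + b1 * income by ordinary least squares.
X = np.column_stack([np.ones_like(income), income])
beta, *_ = np.linalg.lstsq(X, revenue, rcond=None)

# Forecast next year's revenue from a projected income of $6.7B.
forecast = beta[0] + beta[1] * 6.7
```

A production forecast would add more drivers (employment, inflation) and prediction intervals, but even this minimal form supports the article's point that systematic forecasting does not require "rocket scientists."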