Search results
John O'Neill, Barry Bloom and Khoa Tang
Abstract
Purpose
The purpose of this paper is to be the first empirical article to provide necessary standard deviation inputs for adoption in probabilistic prognostications of hotel revenues and expenses, i.e. prognostications that consider risk. Commonly accepted methodologies to develop hotel financial projections resulting in point estimates of upcoming performance have been perceived as egregiously insufficient because they do not consider risk in lodging investments. Previous research has recommended the use of probabilistic methodologies to address this concern, and it has been recommended that analysts use Monte Carlo simulation. This methodology requires the estimation of standard deviations of specific, future hotel revenue and expense items, and this paper provides such inputs based on a large sample of actual, recent data.
Design/methodology/approach
This study provides empirically supported standard deviations, derived from a sample of recent hotel profit and loss (P&L) statements for over 3,000 hotels (over 19,000 P&L statements), that analysts may apply to Uniform System of Accounts for the Lodging Industry (USALI) hotel revenue and expense items in hotel financial (revenue and expense) prognostications.
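The kind of probabilistic projection the paper supplies inputs for can be sketched as follows; the mean and standard deviation figures below are illustrative placeholders, not values from the paper's sample.

```python
import random

# Illustrative inputs only: the paper's contribution is supplying
# empirically derived standard deviations; these numbers are made up.
REVENUE_MEAN, REVENUE_SD = 5_000_000, 600_000   # USD per year
EXPENSE_MEAN, EXPENSE_SD = 3_800_000, 300_000   # USD per year

def simulate_gop(n_trials=10_000, seed=42):
    """Monte Carlo draw of gross operating profit = revenue - expense,
    with each line item drawn from a normal distribution."""
    rng = random.Random(seed)
    profits = sorted(
        rng.gauss(REVENUE_MEAN, REVENUE_SD) - rng.gauss(EXPENSE_MEAN, EXPENSE_SD)
        for _ in range(n_trials)
    )
    return {
        "mean": sum(profits) / n_trials,
        "p05": profits[int(0.05 * n_trials)],   # downside risk estimate
        "p95": profits[int(0.95 * n_trials)],
    }

result = simulate_gop()
```

Unlike a point estimate, the simulation yields a distribution of outcomes, so the analyst can read off downside percentiles rather than a single number.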
Findings
Findings are presented for standard deviations based on typical line items as defined in the USALI, and these findings may be used by practitioners as inputs for hotel financial projections. Findings also include that hotel revenue items generally have higher standard deviations than expense items. Findings are presented in detail in the manuscript, including overall findings, as well as findings based on hotel class.
Practical implications
Rather than practitioners adopting standard deviations of hotel revenue and expense line items based on guesswork or judgment, which is the current “state of the art” in hotel financial projections, this paper provides practitioners with actual standard deviations which may be adopted in probabilistic prognostications of hotel revenues and expenses.
Originality/value
This paper may be the first to provide practitioners with actual standard deviations, based on typical USALI line items, for adoption in probabilistic prognostications of hotel revenues and expenses.
Abstract
Means, medians and SDs for available socio‐economic status (SES) black‐white differences are here substituted for those of IQ in a between‐groups model published by the author over a decade ago. The goodness of fit of the SES variables used is compared with that for the earlier IQ data. Even when SES variables are relatively successful, this can be viewed as additional evidence of the importance of IQ differences to black‐white differences in delinquency.
Abstract
The quantitative assessment of the degree of risk associated with the direct acquisition of commercial property for investment purposes is practically non‐existent. There is almost always a total reliance on unquantified subjective feeling with no attempt to transform such a qualitative treatment into an analytically more acceptable and useful form. Whilst the investment capitalisation rate should, to an extent, reflect the investor's view of the future earnings capacity of a particular property, this yield rate is principally a function of general market sentiment and may not significantly allow for the inherent risk characteristics of an individual investment. This is especially the case at the prime end of the market where the pressure of funds competing to invest in a sector of particularly limited supply remains most severe.
Moustafa Omar Ahmed Abu‐Shawiesh
Abstract
Purpose
This paper seeks to propose a univariate robust control chart for location and the necessary table of factors for computing the control limits and the central line as an alternative to the Shewhart X¯ control chart.
Design/methodology/approach
The proposed method is based on two robust estimators, namely, the sample median, MD, to estimate the process mean, μ, and the median absolute deviation from the sample median, MAD, to estimate the process standard deviation, σ. A numerical example was given and a simulation study was conducted in order to illustrate the performance of the proposed method and compare it with that of the traditional Shewhart X¯ control chart.
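A minimal sketch of limits built from the two robust estimators named above. The paper tabulates exact chart factors by subgroup size; this sketch instead uses the generic normal-theory consistency constant b = 1.4826 and a 3-sigma width, so it illustrates the idea rather than reproducing the paper's factors.

```python
import statistics

def mdmad_limits(subgroups, b=1.4826):
    """Center line and 3-sigma-style control limits from robust estimators:
    the median of subgroup medians estimates mu, and b times the median of
    subgroup MADs estimates sigma (b = 1.4826 makes the MAD a consistent
    estimator of sigma under normality)."""
    n = len(subgroups[0])
    medians = [statistics.median(s) for s in subgroups]
    center = statistics.median(medians)
    # MAD of each subgroup about its own median, pooled by the median
    mads = [statistics.median([abs(x - statistics.median(s)) for x in s])
            for s in subgroups]
    sigma_hat = b * statistics.median(mads)
    half = 3 * sigma_hat / n ** 0.5
    return center - half, center, center + half
```

Because medians and MADs ignore extreme observations, a single outlying measurement in a subgroup barely moves the limits, which is the robustness property the chart exploits.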
Findings
The proposed robust X¯MDMAD control chart gives better performance than the traditional Shewhart X¯ control chart if the underlying distribution of chance causes is non‐normal. It has good properties for heavy‐tailed distribution functions and moderate sample sizes and it compares favorably with the traditional Shewhart X¯ control chart.
Originality/value
The most common statistical process control (SPC) tool is the traditional Shewhart X¯ control chart. The chart is used to monitor the process mean based on the assumption that the underlying distribution of the quality characteristic is normal and there is no major contamination due to outliers. The sample mean, X¯, and the sample standard deviation, S, are the most efficient location and scale estimators for the normal distribution often used to construct the X¯ control chart, but the sample mean, X¯, and the sample standard deviation, S, might not be the best choices when one or both assumptions are not met. Therefore, the need for alternatives to the X¯ control chart comes into play. The literature shows that the sample median, MD, and the median absolute deviation from the sample median, MAD, are indeed more resistant to departures from normality and the presence of outliers.
Huahan Liu, Qiang Dong and Wei Jiang
Abstract
Purpose
The purpose of this paper is to present a new methodology, used for dynamic reliability analysis of a gear transmission system (GTS) of wind turbine (WT), which could be used for assembly decision-making of the parts with errors to improve the GTS’s performance.
Design/methodology/approach
This paper involves the dynamic and dynamic reliability analysis of a GTS. The history curves of dynamic responses of the parts are obtained with the developed gear-bearing coupling dynamic model considering the random errors, failure dependency and random load. Then, the surrogate models of the mean and standard deviation of responses are presented by statistics, rain flow counting method and corrected-partial least squares regression response surface method. Further, a novel dynamic reliability model based on the maximum extreme theory, a theory of sequential statistics, equivalent principles and the inverse transform theory of random variable sampling, is developed to overcome the limitations of traditional methods.
Findings
The dynamic reliability of the GTS considering the different impact factors is evaluated. The proposed reliability methodology not only overcomes the limitations associated with traditional approaches, but also provides good guidance for assembling the parts in a GTS to achieve its best performance.
Originality/value
Instead of constant errors, this paper considers the randomness of the impact factors in developing the dynamic reliability model. Further, unlike the traditional method, which is limited to normally distributed random parameters, the proposed methodology can deal with problems involving non-normally distributed parameters, making it more suitable for real engineering problems.
Mohsen Bahmani‐Oskooee and Scott W. Hegerty
Abstract
Purpose
Since the last review article by McKenzie, the literature has experienced a surge in the number of empirical articles. These new contributions, coupled with those that were overlooked by McKenzie, set the stage for this review. Many of the recent studies have been empirical in nature and these deserve specific attention. Thus, this paper aims to survey and review all of the studies by paying attention to the attributes outlined in the text.
Design/methodology/approach
This paper examines the vast empirical literature, up to 2005, to assess the main trends in modeling and estimating these trade flows at the aggregate, bilateral, and sectoral levels.
Findings
The increase in exchange‐rate volatility since 1973 has had indeterminate effects on international export and import flows. Although it can be assumed that an increase in risk may lead to a reduction in economic activity, the theoretical literature provides justifications for positive or insignificant effects as well. Similar results have been found in empirical tests. While modeling techniques have evolved over time to incorporate new developments in econometric analysis, no single measure of exchange‐rate volatility has dominated the literature.
Originality/value
An argument put forward by the opponents of the floating exchange rates is that such rates introduce uncertainty into the foreign exchange market, which could deter trade flows. However, a theoretical argument is put forward by some to show that uncertainty could also boost trade flows if traders increase their trade volume to offset any decrease in future revenue due to exchange rate volatility. The empirical literature reviewed in this paper supports both views.
D.R. Prajapati and P.B. Mahapatra
Abstract
Purpose
The purpose of this paper is to introduce a new design of an R chart to catch smaller shifts in the process dispersion while maintaining simplicity, so that it may be applied at the shopfloor level.
Design/methodology/approach
Here a new R chart has been proposed which can overcome the limitations of Shewhart, CUSUM and EWMA range charts. The concept of this R chart is based on chi‐square (χ2) distribution. Although CUSUM and EWMA charts are very useful for catching the small shifts in the mean or standard deviation, they can catch the process shift only when there is a single and sustained shift in process average or standard deviation.
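For reference, the conventional Shewhart R chart against which the proposed chi-square-based design is benchmarked can be computed as below; D3 and D4 are the standard tabulated constants, and the paper's own chi-square factors are not reproduced here.

```python
# Standard tabulated Shewhart R chart constants for subgroup sizes n = 2..5
D3 = {2: 0.0, 3: 0.0, 4: 0.0, 5: 0.0}
D4 = {2: 3.267, 3: 2.574, 4: 2.282, 5: 2.114}

def shewhart_r_limits(subgroups):
    """LCL, center line and UCL of a conventional Shewhart R chart:
    center at the average subgroup range R-bar, limits at D3*R-bar
    and D4*R-bar for the given subgroup size."""
    n = len(subgroups[0])
    ranges = [max(s) - min(s) for s in subgroups]
    r_bar = sum(ranges) / len(ranges)
    return D3[n] * r_bar, r_bar, D4[n] * r_bar
```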
Findings
It was found that the proposed chart performs significantly better than the conventional (Shewhart) R chart and the CUSUM range schemes proposed by Chang and Gan for most process shifts in standard deviation. The ARLs of the proposed R chart are higher than the ARLs of the CUSUM schemes in only ten cases out of 40. The performance of the proposed R chart has also been compared with the variance chart proposed by Chang and Gan for various shifts in standard deviation. Comparing the ARLs of the proposed R chart with Chang's variance chart for a sample size of three shows that the proposed R chart is much better for all shift ratios. Many difficulties related to the operation and design of CUSUM and EWMA control charts are greatly reduced by the simple and accurate proposed R chart scheme. The performance characteristics (ARLs) of the proposed charts are closely comparable with FIR CUSUM, simple CUSUM and other variance charts. It can be concluded that, instead of considering many parameters, it is better to use a single sample size and a single set of control limits, because a control chart loses its simplicity with a greater number of parameters, and practitioners may find a complex chart difficult to apply in production processes. Moreover, CUSUM control charts are effective only when there is a single and sustained shift in the process dispersion.
Research limitations/implications
Considerable effort has been made to develop the new range charts for monitoring the process dispersion. Various assumptions and factors affecting the performance of the R chart have been identified and taken into account. In the proposed design, the observations have been assumed to be independent of one another, but the observations may also be assumed to be auto‐correlated with previous observations, and the performance of the proposed R chart may then be studied.
Originality/value
The research findings could be applied to various manufacturing and service industries as it is more effective than the conventional (Shewhart) R chart and simpler than CUSUM charts.
Abstract
Purpose
The purpose of this paper is to show that major reversals of an index (specifically BIST-30 index) can be detected uniquely on the date of reversal by checking the extreme outliers in the rate of change series using daily closing prices.
Design/methodology/approach
The extreme outliers are determined by checking whether either the rate of change series or the volatility of the rate of change series deviates by more than two standard deviations on the date of reversal. Furthermore, wavelet analysis is also utilized for this purpose, by checking the extreme outlier characteristics of the A1 (approximation level 1) and D3 (detail level 3) wavelet components.
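The two-standard-deviation criterion on the rate-of-change series can be sketched as follows (the volatility and wavelet-component checks are omitted, and the function name is illustrative).

```python
import statistics

def extreme_outlier_days(closes, k=2.0):
    """Flag the positions (indices into `closes`) whose daily rate of
    change lies more than k standard deviations from the mean of the
    rate-of-change series; k = 2 is the criterion described above."""
    roc = [(b - a) / a for a, b in zip(closes, closes[1:])]
    mu = statistics.fmean(roc)
    sd = statistics.stdev(roc)
    # roc[i] is the change into closes[i + 1], hence the i + 1 offset
    return [i + 1 for i, r in enumerate(roc) if abs(r - mu) > k * sd]
```

In practice the check would run on each new closing price, so a reversal is flagged on the day it occurs rather than in hindsight.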
Findings
The paper investigates ten major reversals of the BIST-30 index during a five-year period. It conclusively shows that all these major reversals are characterized by the extreme outliers mentioned above. The paper also checks whether these major reversals are unique in the sense of being observed only on the date of reversal, but not before. The empirical results confirm the uniqueness. The paper also demonstrates empirically that extreme outliers are associated only with major reversals, not minor ones.
Practical implications
The results are important for fund managers for whom the timely identification of the initial phase of a major bullish or bearish trend is crucial. Such timely identification of the major reversals is also important for the hedging applications since a major issue in the practical implementation of the stock index futures as a hedging instrument is the correct timing of derivatives positions.
Originality/value
To the best of the author's knowledge, this is the first study dealing with the issue of major reversal identification. This is evidently so for the BIST-30 index, and the use of extreme outliers for this purpose is also a novelty, in the sense that neither the use of rate-of-change extremity nor the use of wavelet decomposition was addressed before in the international literature.
Chih-Chen Hsu, Kai-Chieh Chia and Yu-Chieh Chang
Abstract
This study investigates the efficiency of value relevance and faithful representation when the stock market price deviates from firm value, for IT companies listed in the FTSE Taiwan 50. The empirical investigation reveals that, of seven financial indicators examined (earnings per share (EPS), book value (BV), dividend yield (Div.), price–earnings ratio (P/E), return on equity (ROE), return on assets (ROA), and return on operating assets (ROOA)), only ROE has explanatory ability for both sampled companies, United Microelectronics Corporation (UMC, 2303) and Taiwan Semiconductor Manufacturing Company Limited (TSMC, 2330). Furthermore, the empirical results indicate that the higher-order moments, skewness and kurtosis, of the price deviation do not provide reliable prediction or explanatory power for stock price trends.
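The higher-order moments tested above are standard quantities; a minimal sketch of the population-form estimators is given below, independent of the study's data.

```python
import statistics

def skewness(xs):
    """Sample skewness: the third standardized moment (population form).
    Zero for a symmetric sample."""
    mu = statistics.fmean(xs)
    sd = statistics.pstdev(xs)
    n = len(xs)
    return sum((x - mu) ** 3 for x in xs) / (n * sd ** 3)

def excess_kurtosis(xs):
    """Excess kurtosis: the fourth standardized moment minus 3, so a
    normal distribution scores 0 and heavy tails score positive."""
    mu = statistics.fmean(xs)
    sd = statistics.pstdev(xs)
    n = len(xs)
    return sum((x - mu) ** 4 for x in xs) / (n * sd ** 4) - 3
```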
Abstract
Purpose
Aims to determine whether the selection of the historical return time interval (monthly, quarterly, semiannual, or annual) used for calculating real estate investment trust (REIT) returns has a significant effect on optimal portfolio allocations.
Design/methodology/approach
Using a mean‐variance utility function, optimal allocations to portfolios of stocks, bonds, bills, and REITs across different levels of assumed investor risk aversion are calculated. The average historical returns, standard deviations, and correlations (assuming different time intervals) of the various asset classes are used as mean‐variance inputs. Results are also compared using more recent data (since 1988) with those from the full REIT history, which goes back to 1972.
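Two of the ingredients above can be sketched briefly: the usual square-root-of-time annualization of a shorter-interval standard deviation (an i.i.d. assumption that understates annual variability when returns are positively autocorrelated, which is the effect at issue) and the mean-variance utility criterion. Function names are illustrative.

```python
import statistics

def annualized_sd(period_returns, periods_per_year):
    """Scale a per-period standard deviation to annual by sqrt(time).
    Valid only under i.i.d. returns; positive autocorrelation makes
    this an understatement of true annual variability."""
    return statistics.stdev(period_returns) * periods_per_year ** 0.5

def mv_utility(expected_return, sd, risk_aversion):
    """Mean-variance utility U = E[r] - 0.5 * A * sigma^2; the optimal
    allocation is the portfolio mix that maximizes U for a given A."""
    return expected_return - 0.5 * risk_aversion * sd ** 2
```

An understated SD input raises the utility of the volatile asset, which is how the interval choice feeds through to higher REIT allocations.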
Findings
Using the more recent REIT data rather than the full dataset results in considerably higher optimal allocations to REITs. Likewise, using monthly and quarterly returns tends to understate the variability of REITs and leads to higher portfolio allocations.
Research limitations/implications
The results of this study are based on the limited historical return data that are currently available for REITs. The results of future time periods may not prove to be consistent with the findings.
Practical implications
Numerous research papers arbitrarily decide to employ monthly or quarterly returns in their analyses to increase the number of REIT observations they have available. These shorter interval returns are generally annualized. This paper addresses the consequences of those decisions.
Originality/value
It has been shown that the decision to use return estimation intervals shorter than a year does have dramatic consequences on the results obtained and, therefore, must be carefully considered and justified.