Search results

1 – 10 of over 86,000
Article
Publication date: 19 April 2011

Shihchieh Chou, Chinyi Cheng and Szujui Huang

Abstract

Purpose

The purpose of this paper is to establish a new approach for solving the expansion term problem.

Design/methodology/approach

This study develops an expansion term weighting function derived from the valuable concepts used by previous approaches. These concepts include probability measurement, adjustment according to situations, and summation of weights. Formal tests have been conducted to compare the proposed weighting function with the baseline ranking model and other weighting functions.
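
The abstract names the ingredients of the weighting function (probability measurement, adjustment according to the situation, summation of weights) without giving the function itself. As a rough, hedged illustration only, the sketch below sums a within-document probability estimate scaled by an IDF-style factor over the feedback documents; the function name, the scaling choice and the example data are assumptions, not the authors' formulation.

```python
import math
from collections import Counter

def expansion_term_weights(feedback_docs, collection_df, num_docs, top_k=10):
    """Illustrative probability-based expansion-term weighting (not the
    authors' exact function): a term's weight sums, over the feedback
    documents, its within-document probability scaled by an IDF-style
    factor estimated from the whole collection."""
    weights = Counter()
    for doc in feedback_docs:                        # doc: list of tokens
        tf = Counter(doc)
        doc_len = len(doc)
        for term, freq in tf.items():
            p_term_given_doc = freq / doc_len        # probability measurement
            idf = math.log(num_docs / (1 + collection_df.get(term, 0)))
            weights[term] += p_term_given_doc * idf  # summation of weights
    return weights.most_common(top_k)

# Example: two pseudo-relevant documents and toy collection statistics
docs = [["query", "expansion", "term", "weighting"],
        ["term", "weighting", "improves", "retrieval"]]
print(expansion_term_weights(docs, {"term": 120, "weighting": 40}, num_docs=10000))
```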

Findings

The results reveal stable performance by the proposed expansion term weighting function. It proves more effective than the baseline ranking model and outperforms other weighting functions.

Research limitations/implications

Testing on additional data sets and application to real working situations are required before the generalisability and superiority of the proposed expansion term weighting function can be asserted.

Originality/value

Stable performance and an acceptable level of effectiveness for the proposed expansion term weighting function indicate the potential for further study and development of this approach. This would add to the current methods studied by the information retrieval community for culling information from documents.

Details

Online Information Review, vol. 35 no. 2
Type: Research Article
ISSN: 1468-4527

Article
Publication date: 3 December 2018

Selcuk Cebi and Cengiz Kahraman

Abstract

Purpose

The purpose of this paper is to propose a novel weighting algorithm for fuzzy information axiom (IA) and to apply it to the evaluation process of 3D printers.

Design/methodology/approach

As a decision-making tool, the IA method is presented to evaluate the performance of any design. Weighted IA methods are then investigated and a new weighting procedure is introduced to the literature. The existing axiomatic design methods and the proposed method are classified into two groups: weighting based on information content and weighting based on design ranges. The information-content-based approach consists of four methods, including pessimistic and optimistic approaches. The philosophy of weighting based on design ranges is to narrow the design ranges in order to decrease fuzziness in the model. To demonstrate the robustness and performance of the proposed weighting method, its results are compared with those of existing methods in the literature. The new approach is then applied to evaluate 3D printers.
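
For readers unfamiliar with the information axiom, a minimal crisp sketch follows, assuming crisp (non-fuzzy) design and system ranges and user-supplied criterion importances; the paper's fuzzy formulation and its new weighting procedure are not reproduced here, and the example numbers are invented.

```python
import math

def information_content(design_range, system_range):
    """Crisp information-axiom score: I = log2(system range / common range).
    Smaller I means the design satisfies the functional requirement better.
    (A simplified stand-in for the paper's fuzzy formulation.)"""
    d_lo, d_hi = design_range
    s_lo, s_hi = system_range
    common = max(0.0, min(d_hi, s_hi) - max(d_lo, s_lo))
    if common == 0.0:
        return float("inf")                 # requirement cannot be met
    return math.log2((s_hi - s_lo) / common)

def weighted_total_information(criteria, weights):
    """Weighted sum of information contents across criteria; the weights
    here are simply user-supplied importances (illustrative only)."""
    return sum(w * information_content(dr, sr)
               for (dr, sr), w in zip(criteria, weights))

# Example: one 3D-printer alternative evaluated on two hypothetical criteria
printer = [((0.1, 0.3), (0.0, 0.5)),        # e.g. layer accuracy (mm)
           ((40, 80), (30, 100))]           # e.g. print speed (mm/s)
print(weighted_total_information(printer, weights=[0.6, 0.4]))
```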

Findings

The results show that the proposed weighting algorithm performs better than the existing ones for IA, and it should therefore be adopted as the weighting tool for IA.

Originality/value

An effective weighting method compatible with the philosophy of the IA method has been proposed. Furthermore, the performances of 3D printers are compared using the proposed method.

Details

Journal of Enterprise Information Management, vol. 32 no. 1
Type: Research Article
ISSN: 1741-0398

Article
Publication date: 14 September 2010

Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou and Dimitrios Zevgolis

Abstract

Purpose

Evaluation of energy and climate policy interactions is a complex issue, and the incorporation of stakeholders' preferences has not been addressed systematically. The purpose of this paper is to present an integrated weighting methodology developed to incorporate weighting preferences into an ex ante evaluation of climate and energy policy interactions.

Design/methodology/approach

A multi-criteria analysis (MCA) weighting methodology that combines pair-wise comparisons and ratio importance weighting methods has been elaborated. It first introduces users to the evaluation process through a warm-up holistic approach that produces an initial ranking of the criteria, and then helps them express relative importance ratios in pair-wise comparisons of criteria through an interactive means with verbal, numerical and visual representations of their preferences. Moreover, it provides a ranking consistency test in which users can see the degree of (in)consistency of their preferences.
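
As a hedged illustration of the pair-wise comparison side of such a methodology, the sketch below derives weights from a reciprocal comparison matrix by the geometric-mean method and reports Saaty's consistency ratio; the paper's interactive combination of holistic ranking and ratio importance elicitation is not reproduced, and the example matrix is invented.

```python
import numpy as np

# Saaty's random consistency index by matrix size (standard published values)
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def weights_and_consistency(pairwise):
    """Geometric-mean weights from a reciprocal pairwise-comparison matrix,
    plus Saaty's consistency ratio (CR < 0.1 is usually deemed acceptable).
    A generic AHP-style sketch, not the paper's exact elicitation procedure."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    w = np.prod(A, axis=1) ** (1.0 / n)      # row geometric means
    w /= w.sum()
    lam_max = np.mean((A @ w) / w)           # principal-eigenvalue estimate
    ci = (lam_max - n) / (n - 1)
    cr = ci / RI[n]
    return w, cr

# Example: three criteria compared on a 1-9 ratio scale
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = weights_and_consistency(A)
print(w, cr)
```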

Findings

Stakeholders and experts in the energy policy field who tested the methodology expressed their approval of and satisfaction with the combination of ranking and pair-wise comparison techniques, since it allows a gradual approach to the evaluation problem. In addition, the main difficulties in MCA weight elicitation processes were overcome.

Research limitations/implications

The methodology was tested with a small sample of stakeholders; a larger sample, a broader range of stakeholders and applications to different climate policy evaluation cases merit further research.

Originality/value

The novel aspect of the developed methodology consists of the combination of ranking and pair‐wise comparison techniques for the elicitation of stakeholders' preferences.

Details

International Journal of Energy Sector Management, vol. 4 no. 3
Type: Research Article
ISSN: 1750-6220

Article
Publication date: 28 August 2008

May H. Lo and Le (Emily) Xu

Abstract

Purpose

The purpose of this study is to examine whether financial analysts mislead investors in recognizing the differential persistence of the three cash flow components of earnings, defined by Dechow et al., in forecasting annual earnings.

Design/methodology/approach

The paper uses Mishkin's econometric approach to compare the persistence of the cash flow components within and across the historical, analysts' and investors' weightings.
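
Mishkin's approach jointly estimates a forecasting equation and a pricing equation and tests cross-equation restrictions; the sketch below is only a simplified stand-in that contrasts two separate OLS regressions (historical persistence versus analysts' implied weights of the cash flow components) on synthetic data. All variable names, coefficients and data are illustrative assumptions, not the paper's estimates.

```python
import numpy as np

def implied_weights(X, y):
    """OLS slope estimates of y on the three cash-flow components in X
    (an intercept is added internally)."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta[1:]

# X: n x 3 matrix of current-period cash flow components (per Dechow et al.)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
future_earnings = X @ np.array([0.8, 0.5, 0.3]) + rng.normal(scale=0.1, size=500)
analyst_forecast = X @ np.array([0.75, 0.5, 0.35]) + rng.normal(scale=0.1, size=500)

historical = implied_weights(X, future_earnings)    # "historical" persistence
analysts = implied_weights(X, analyst_forecast)     # analysts' implied weights
print(historical, analysts, analysts - historical)  # mis-weighting gap
```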

Findings

It is found that financial analysts' weightings of the cash flow components are more closely aligned with the historical relations than are investors' weightings, both in direction and in magnitude. The degree of analysts' mis‐weighting is economically small and much lower than the degree of investors' mis‐weighting. Moreover, the extent of both investors' and analysts' mis‐weightings of the cash components is generally smaller for firms with greater levels of analyst following, a proxy for the quality of the information environment.

Research limitations/implications

The findings suggest that financial analysts' bias in weighting the cash components of earnings is at best a partial explanation for investors' bias.

Practical implications

This study is important to academics and the investment community that relies upon financial analysts as information intermediaries, because the ability of analysts to incorporate value‐relevant information in their published expectations may impact securities prices.

Originality/value

The study is the first to document the weightings of the cash components of earnings by financial analysts. In addition, this paper provides evidence that financial analysts, as information intermediaries, are less biased than investors in processing not only the accrual but also the cash components of earnings.

Details

Accounting Research Journal, vol. 21 no. 1
Type: Research Article
ISSN: 1030-9616

Article
Publication date: 30 October 2020

Adrija Majumdar and Arnab Adhikari

Abstract

Purpose

In the context of the sharing economy, Airbnb's superhost program has emerged as a phenomenal success story that has transformed the tourism industry and garnered enormous popularity. Proper performance evaluation and classification of superhosts are crucial to incentivize them to maintain higher service quality. The main objective of this paper is to design an integrated multicriteria decision-making (MCDM) method-based performance evaluation and classification framework for Airbnb superhosts and to study the variation in contextual factors such as price, number of listings and cancelation policy across the superhosts.

Design/methodology/approach

This work considers three weighting techniques (mean-, entropy- and CRITIC-based methods) to determine the weights of the factors. For each weighting technique, an integrated TOPSIS-MOORA-based performance evaluation method and classification framework have been developed. The proposed methodology has been applied to the performance evaluation of the 7,308 superhosts of New York City using real data from Airbnb.
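
As a hedged sketch of two of the building blocks named above (entropy-based weights and a TOPSIS closeness score, with all criteria treated as benefits), consider the following; the CRITIC weights, the MOORA component and the classification step are not reproduced, and the example numbers are invented.

```python
import numpy as np

def entropy_weights(X):
    """Entropy-based criterion weights from a decision matrix X
    (rows = alternatives, columns = benefit criteria)."""
    P = X / X.sum(axis=0)
    k = 1.0 / np.log(len(X))
    entropy = -k * np.sum(P * np.log(P + 1e-12), axis=0)
    d = 1.0 - entropy                       # degree of diversification
    return d / d.sum()

def topsis_scores(X, w):
    """Relative closeness to the ideal solution (all criteria as benefits)."""
    R = X / np.linalg.norm(X, axis=0)       # vector normalisation
    V = R * w
    ideal, anti = V.max(axis=0), V.min(axis=0)
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Example: four hosts scored on three benefit criteria
X = np.array([[4.9, 120, 0.98],
              [4.7,  80, 0.95],
              [4.8, 200, 0.90],
              [4.6,  60, 0.99]], dtype=float)
w = entropy_weights(X)
print(topsis_scores(X, w))                  # higher = better-performing host
```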

Findings

From the perspective of performance evaluation, the importance of devising an integrated methodology instead of adopting a single approach has been highlighted using a nonparametric Wilcoxon signed-rank test. As per the context-specific findings, it has been observed that the price and the number of listings are the highest for the superhosts in the topmost category.

Practical implications

The proposed methodology facilitates the design of a leaderboard to motivate service providers to perform better. It is also applicable to other accommodation-sharing and ride-sharing platforms.

Originality/value

This is the first work that proposes a performance evaluation and classification framework for the service providers of the sharing economy in the context of the tourism industry.

Details

Benchmarking: An International Journal, vol. 28 no. 2
Type: Research Article
ISSN: 1463-5771

Article
Publication date: 1 January 1979

Karen Sparck Jones

Abstract

Previous experiments demonstrated the value of relevance weighting for search terms, but relied on substantial relevance information for the terms. The present experiments were designed to study the effects of weights based on very limited relevance information, for example supplied by one or two relevant documents. The tests simulated iterative searching, as in an on‐line system, and show that even very little relevance information can be of considerable value.
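
One widely used formulation of such a relevance weight, the Robertson and Sparck Jones form with the add-0.5 smoothing that keeps it defined when only one or two relevant documents are known, is sketched below; the specific weighting variants tested in the paper are detailed in its full text, and the example figures are invented.

```python
import math

def relevance_weight(r, R, n, N):
    """Relevance weight for a search term with add-0.5 smoothing, so it stays
    defined even with very limited relevance information.
      r: relevant documents containing the term
      R: relevant documents known so far
      n: documents in the collection containing the term
      N: documents in the collection
    """
    return math.log(((r + 0.5) * (N - n - R + r + 0.5)) /
                    ((R - r + 0.5) * (n - r + 0.5)))

# One known relevant document containing the term, in a 10,000-document collection
print(relevance_weight(r=1, R=1, n=50, N=10_000))
```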

Details

Journal of Documentation, vol. 35 no. 1
Type: Research Article
ISSN: 0022-0418

Article
Publication date: 9 August 2011

Hamza Bahaji

Abstract

Purpose

This paper aims to analyze the valuation of stock options from the perspective of an employee exhibiting preferences as described by cumulative prospect theory (CPT). In addition, it elaborates on their incentive effects and some implications in terms of design.

Design/methodology/approach

The paper draws on the CPT framework to derive a continuous time model of the stock option subjective value using the certainty equivalence principle. Numerical simulations are used to analyze the sensitivity of the subjective value to preference-related parameters and to investigate the incentive effects.
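
The continuous-time model itself is not reproduced here; as a hedged, discrete illustration of the two CPT ingredients it relies on, the sketch below applies the Tversky and Kahneman probability weighting function and a power value function to a simple option-payoff gamble and backs out a certainty equivalent. The parameter values (alpha, gamma) and the example payoffs are illustrative assumptions.

```python
import numpy as np

def tk_weight(p, gamma=0.65):
    """Tversky and Kahneman probability weighting function."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def cpt_certainty_equivalent(payoffs, probs, alpha=0.88, gamma=0.65):
    """Certainty equivalent of a gamble with non-negative payoffs under a
    simplified CPT specification (power value function v(x) = x**alpha,
    rank-dependent decision weights). Illustrative only, not the paper's
    continuous-time model."""
    order = np.argsort(payoffs)[::-1]              # best outcome first
    x, p = np.asarray(payoffs, float)[order], np.asarray(probs, float)[order]
    cum = np.cumsum(p)
    W = tk_weight(cum, gamma)
    dw = np.diff(np.concatenate(([0.0], W)))       # decision weights
    value = np.sum(dw * x**alpha)
    return value ** (1 / alpha)                    # invert v to get the CE

# Option payoff gamble: 30% chance the option finishes 20 in the money
print(cpt_certainty_equivalent([20.0, 0.0], [0.3, 0.7]))
```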

Findings

Consistent with a growing body of empirical and experimental studies, the model predicts that the employee may value his options in excess of their risk-neutral value. Moreover, for typical settings of the preference parameters around the experimental estimates, and assuming the company is allowed to adjust existing compensation when making new stock option grants, the model predicts that incentives are maximized for strike prices set around the stock price at inception. This finding is consistent with companies' actual compensation practices. Finally, the model predicts that an executive who is subject to probability weighting may be more inclined than a risk-neutral executive to act so as to increase the firm's asset volatility.

Originality/value

This research proposes an alternative theoretical framework for the analysis of the pay-to-performance sensitivity of equity-based compensation that takes into account a number of prominent patterns of employee behavior that expected utility theory cannot explain. It contributes to recent empirical and theoretical research that has advanced the CPT framework as a promising candidate for the analysis of equity-based compensation contracts.

Details

Review of Accounting and Finance, vol. 10 no. 3
Type: Research Article
ISSN: 1475-7702

Article
Publication date: 8 April 2021

Mariem Bounabi, Karim Elmoutaouakil and Khalid Satori

Abstract

Purpose

This paper aims to present a new term weighting approach for text classification as a text mining task. The original method, neutrosophic term frequency – inverse term frequency (NTF-IDF), is an extended version of the popular fuzzy TF-IDF (FTF-IDF) and uses neutrosophic reasoning to analyze and generate weights for terms in natural languages. The paper also proposes a comparative study of the popular FTF-IDF and NTF-IDF and their impact on different machine learning (ML) classifiers for document categorization.

Design/methodology/approach

After preprocessing the textual data, NTF-IDF applies a neutrosophic inference system (NIS) to produce weights for the terms representing a document. Using the local frequency (TF), the global frequency (IDF) and the text length N as NIS inputs, this study generates two neutrosophic weights for a given term. The first measures the term's degree of relevance, and the second represents its degree of ambiguity. Next, the Zhang combination function is applied to combine the neutrosophic weight outputs into the final term weight, which is inserted in the document's representative vector. To analyze the impact of NTF-IDF on the classification phase, this study uses a set of ML algorithms.
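
The NIS rule base and the Zhang combination function are not given in the abstract, so the sketch below only mirrors the shape of the pipeline: per-term TF and IDF go in, a relevance degree and an ambiguity degree come out, and the two are fused into a single weight. The mapping and fusion rules shown are placeholders invented for illustration, not the paper's.

```python
import math
from collections import Counter

def ntf_idf_like_weights(doc_tokens, doc_freq, num_docs):
    """Pipeline-shaped stand-in for NTF-IDF: per-term TF and IDF are mapped
    to a relevance degree and an ambiguity degree, then fused into a weight.
    The mapping and fusion below are placeholders, not the paper's NIS rules
    or the Zhang combination function."""
    tf = Counter(doc_tokens)
    n = len(doc_tokens)
    weights = {}
    for term, freq in tf.items():
        tf_norm = freq / n
        idf_norm = math.log(num_docs / (1 + doc_freq.get(term, 0))) / math.log(num_docs)
        relevance = tf_norm * idf_norm            # how indicative the term is
        ambiguity = 1.0 - idf_norm                # common terms are more ambiguous
        weights[term] = relevance * (1.0 - ambiguity)
    return weights

doc = ["neutrosophic", "term", "weighting", "for", "text", "classification", "term"]
print(ntf_idf_like_weights(doc, {"term": 800, "for": 9000, "text": 1200}, num_docs=10_000))
```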

Findings

By exploiting the characteristics of neutrosophic logic (NL), the authors were able to study the ambiguity of terms and their degree of relevance in representing a document. The choice of NL has proven effective in defining meaningful text vectorization weights, especially for text classification tasks. The experiments demonstrate that the new method positively impacts categorization. Moreover, the adopted system's recognition rate is higher than 91%, an accuracy score not attained using FTF-IDF. Furthermore, on benchmark data sets from different text mining fields and with several ML classifiers (e.g. SVM and feed-forward networks), applying the proposed NTF-IDF term scores improves accuracy by 10%.

Originality/value

The novelty of this paper lies in two aspects. First, it introduces a new term weighting method that uses term frequencies as components to define the relevance and the ambiguity of a term. Second, it applies NL to infer weights, an original model that also aims to correct the shortcomings of FTF-IDF, which relies on fuzzy logic. The introduced technique was combined with different ML models to improve the accuracy and relevance of the feature vectors fed to the classification mechanism.

Details

International Journal of Web Information Systems, vol. 17 no. 3
Type: Research Article
ISSN: 1744-0084

Article
Publication date: 22 September 2021

Fatemeh Chahkotahi and Mehdi Khashei

Abstract

Purpose

Improving the accuracy and reducing the computational costs of predictions, especially for time series, is one of the most critical parts of decision-making and management processes in different areas and organizations. One of the best ways to achieve high accuracy and low computational cost in time series forecasting is to develop and use efficient hybrid methods. Among combined methods, parallel hybrid approaches are more widely adopted by scholars and often perform better than sequential ones. However, a necessary condition for using parallel combination approaches is estimating appropriate weights for the components. This weighting stage of parallel hybrid models is the factor with the greatest effect on forecasting accuracy as well as computational cost. In the literature, meta-heuristic algorithms have often been applied to weight the components of parallel hybrid models. However, such algorithms, despite their unique advantages, have two serious disadvantages: local optima and time-consuming iterative optimization processes. The purpose of this paper is to develop a linear optimal weighting estimator (LOWE) algorithm for finding the desired component weights in a global, non-iterative, universal manner.

Design/methodology/approach

In this paper, a LOWE algorithm is developed to find the desired component weights in a global, non-iterative, universal manner.
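
One way such a closed-form, non-iterative weighting could look, assuming a least-squares fit of the component forecasts to the actuals over a validation window with weights normalised to sum to one, is sketched below; this is an illustrative scheme under those assumptions, not the paper's LOWE derivation, and the example data are invented.

```python
import numpy as np

def least_squares_combination_weights(component_forecasts, actuals):
    """Closed-form combination weights for a parallel hybrid forecast:
    minimise ||y - F w||^2 over a validation window, then normalise the
    weights to sum to one. An illustrative non-iterative scheme, not the
    paper's exact LOWE algorithm."""
    F = np.asarray(component_forecasts, float)   # shape (T, k): k components
    y = np.asarray(actuals, float)               # shape (T,)
    w, *_ = np.linalg.lstsq(F, y, rcond=None)
    return w / w.sum()

# Example: two component models' forecasts over a five-point validation window
F = np.array([[10.2, 11.0],
              [11.8, 12.5],
              [13.1, 12.7],
              [12.4, 13.0],
              [14.0, 13.5]])
y = np.array([10.5, 12.0, 13.0, 12.6, 13.9])
w = least_squares_combination_weights(F, y)
print(w, F @ w)                                  # weights and combined forecast
```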

Findings

Empirical results indicate that the accuracy of the LOWE-based parallel hybrid model is significantly better than that of meta-heuristic and simple average (SA) based models. The proposed weighting approach improves the performance of the differential evolution (DE), genetic algorithm (GA), particle swarm optimization (PSO) and SA-based parallel hybrid models by 13.96%, 11.64%, 9.35% and 25.05%, respectively, in electricity load forecasting, while its computational costs are considerably lower than those of the GA, PSO and DE-based parallel hybrid models. Therefore, it can be considered an appropriate and effective alternative weighting technique for efficient parallel hybridization in time series forecasting.

Originality/value

In this paper, a LOWE algorithm is developed to find the desired component weights in a global, non-iterative, universal manner. Although it can be demonstrated in general that the proposed weighting technique will perform no worse than the meta-heuristic algorithms, its performance is also evaluated in practice on real-world data sets.

Article
Publication date: 15 August 2023

Yi-Chung Hu

Abstract

Purpose

Tourism demand forecasting is vital for the airline industry and tourism sector. Combination forecasting has the advantage of fusing several forecasts to reduce the risk of inappropriate model selection for analyzing decisions. This paper investigated the effects of a time-varying weighting strategy on the performance of linear and nonlinear forecast combinations in the context of tourism.

Design/methodology/approach

This study used grey prediction models, which did not require that the available data satisfy statistical assumptions, to generate forecasts. A quality-control technique was applied to determine when to change the combination weights to generate combined forecasts by using linear and nonlinear methods.
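
As a hedged sketch of the general idea, the code below combines component forecasts linearly and uses a two-sided CUSUM on the combination errors to decide when to re-estimate the weights; the paper's grey prediction models and Choquet-integral combination are not reproduced, and the thresholds, window length and example data are invented.

```python
import numpy as np

def cusum_recombine(forecasts, actuals, k=0.5, h=4.0, window=8):
    """Linear forecast combination with time-varying weights: a two-sided
    CUSUM on standardised combination errors triggers re-estimation of the
    weights from the most recent `window` observations. Illustrative only."""
    T, m = forecasts.shape
    w = np.full(m, 1.0 / m)                      # start from equal weights
    pos = neg = 0.0
    combined = np.empty(T)
    for t in range(T):
        combined[t] = forecasts[t] @ w
        e = actuals[t] - combined[t]
        scale = np.std(actuals[:t + 1] - forecasts[:t + 1] @ w) or 1.0
        z = e / scale
        pos, neg = max(0.0, pos + z - k), min(0.0, neg + z + k)
        if (pos > h or neg < -h) and t + 1 >= window:
            lo = t + 1 - window
            w, *_ = np.linalg.lstsq(forecasts[lo:t + 1], actuals[lo:t + 1], rcond=None)
            w /= w.sum()
            pos = neg = 0.0                      # reset the CUSUM after re-weighting
    return combined, w

# Toy example: two drifting component forecasts and a noisy actual series
rng = np.random.default_rng(1)
f = np.column_stack([np.linspace(100, 140, 40), np.linspace(98, 143, 40)])
y = np.linspace(99, 142, 40) + rng.normal(scale=0.5, size=40)
print(cusum_recombine(f, y)[1])                  # final combination weights
```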

Findings

The empirical results showed that except for when the Choquet fuzzy integral was used, forecast combination with time-varying weights did not significantly outperform that with fixed weights. The Choquet integral with time-varying weights significantly outperformed that with fixed weights for all model combinations, and its forecasting accuracy was superior to that of the other combination methods.

Practical implications

The tourism sector can benefit from the use of the Choquet integral with time-varying weights, by using it to formulate suitable strategies for tourist destinations.

Originality/value

Combining forecasts with time-varying weights may improve the accuracy of the predictions. This study investigated incorporating a time-varying weighting strategy into combination forecasting by using CUSUM. The results verified the effectiveness of the time-varying Choquet integral for tourism forecast combination.

Details

Grey Systems: Theory and Application, vol. 13 no. 4
Type: Research Article
ISSN: 2043-9377
