Search results

1–10 of over 88,000
Article
Publication date: 3 December 2018

Selcuk Cebi and Cengiz Kahraman

Abstract

Purpose

The purpose of this paper is to propose a novel weighting algorithm for fuzzy information axiom (IA) and to apply it to the evaluation process of 3D printers.

Design/methodology/approach

As a decision-making tool, the IA method is presented for evaluating the performance of any design. Weighted IA methods are then reviewed and a new weighting procedure is introduced to the literature. The existing axiomatic design methods and the proposed method are classified into two groups: weighting based on information content and weighting based on design ranges. The information-content group consists of four methods, including pessimistic and optimistic approaches. The philosophy of weighting based on design ranges is to narrow the design ranges in order to decrease fuzziness in the model. To demonstrate the robustness and performance of the proposed weighting method, its results are compared with those of the existing methods in the literature. The new approach is then applied to evaluate 3D printers.
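The information-content idea underlying IA can be sketched in a few lines. The snippet below implements Suh's crisp formula, I = log2(system range / common range), together with a simple weighted sum over criteria; it is a minimal illustration under a uniform-probability assumption, not the authors' fuzzy extension, and all names and numbers are invented.

```python
import math

def information_content(design_lo, design_hi, system_lo, system_hi):
    """Suh's information content I = log2(system range / common range).

    The common range is the overlap between what the design requires
    (design range) and what the system can deliver (system range),
    assuming a uniform probability density over the system range.
    """
    common = max(0.0, min(design_hi, system_hi) - max(design_lo, system_lo))
    if common == 0.0:
        return float("inf")  # no overlap: the requirement cannot be met
    return math.log2((system_hi - system_lo) / common)

def weighted_total(contents, weights):
    """One of several weighted-IA conventions: a weighted sum of the
    per-criterion information contents."""
    return sum(w * i for w, i in zip(weights, contents))

# design range [5, 15], system range [0, 20] -> common range 10 -> 1 bit
i1 = information_content(5, 15, 0, 20)
```

A design whose system range lies entirely outside the design range gets infinite information content, i.e. it is rejected outright.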

Findings

The results show that the proposed weighting algorithm performs better for IA than the existing ones. The proposed algorithm is therefore recommended as the weighting tool for IA in future applications.

Originality/value

An effective weighting method compatible with the philosophy of IA method has been proposed. Furthermore, the performances of 3D printers are compared by using the proposed method.

Details

Journal of Enterprise Information Management, vol. 32 no. 1
Type: Research Article
ISSN: 1741-0398

Article
Publication date: 19 April 2011

Shihchieh Chou, Chinyi Cheng and Szujui Huang

Abstract

Purpose

The purpose of this paper is to establish a new approach for solving the expansion term problem.

Design/methodology/approach

This study develops an expansion term weighting function derived from the valuable concepts used by previous approaches. These concepts include probability measurement, adjustment according to situations, and summation of weights. Formal tests have been conducted to compare the proposed weighting function with the baseline ranking model and other weighting functions.
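As a rough illustration of how the three concepts can combine, the sketch below sums, over the relevant documents, a per-document probability estimate adjusted by collection-wide rarity. It is an assumed composite in the spirit of the abstract, not the paper's actual weighting function, and the tiny corpus is invented.

```python
import math
from collections import Counter

def expansion_term_weights(relevant_docs, all_docs):
    """Illustrative expansion-term weighting combining: a probability
    estimate (term frequency within a document), an adjustment for
    collection-wide rarity, and summation of per-document weights."""
    n = len(all_docs)
    df = Counter(t for doc in all_docs for t in set(doc))  # document frequency
    scores = Counter()
    for doc in relevant_docs:                    # summation of weights
        for t, f in Counter(doc).items():
            p = f / len(doc)                     # probability measurement
            idf = math.log((n + 1) / (df[t] + 0.5))  # situational adjustment
            scores[t] += p * idf
    return scores

docs = [["fuzzy", "weighting"], ["fuzzy", "axiom"], ["printer"]]
scores = expansion_term_weights(docs[:2], docs)  # first two docs judged relevant
```

Terms never seen in the relevant set receive no score, so only candidate expansion terms drawn from relevant documents are ranked.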

Findings

The results reveal stable performance by the proposed expansion term weighting function. It proves more effective than the baseline ranking model and outperforms other weighting functions.

Research limitations/implications

Testing on additional data sets, and potential application in real working situations, is required before the generalisability and superiority of the proposed expansion term weighting function can be asserted.

Originality/value

Stable performance and an acceptable level of effectiveness for the proposed expansion term weighting function indicate the potential for further study and development of this approach. This would add to the current methods studied by the information retrieval community for culling information from documents.

Details

Online Information Review, vol. 35 no. 2
Type: Research Article
ISSN: 1468-4527

Book part
Publication date: 26 February 2016

Noel Cassar and Simon Grima

Abstract

Introduction

The recent European sovereign debt crisis showed that sovereign debt is not "risk free." Traditional index bond management of the last two decades, such as the market-capitalization weighting scheme, has been severely called into question. To overcome these drawbacks, alternative weighting schemes have recently attracted attention from both academic researchers and market practitioners. One of the key developments was the introduction of passive funds using economic fundamental indicators.

Purpose

In this chapter, the authors introduced models with economic drivers, with the aim of investigating whether the fundamental approaches outperformed the other models on risk-adjusted returns and in other respects.

Methodology

The authors did this by constructing five portfolios composed of Eurozone sovereign bonds: the Market-Capitalization RP, GDP RP, Ratings RP, Fundamental-Ranking RP, and Fundamental-Weighted RP models, created exclusively for this chapter. Both fundamental models use a set of 10 country fundamentals. A variation from other studies is that this chapter applies the risk parity concept, an allocation technique that aims to equalize risk across different assets; it is applied here by taking the credit default swap spread as a proxy for sovereign credit risk. The models were run using the Generalized Reduced Gradient (GRG) optimization method, together with Lagrange multipliers and the Karush–Kuhn–Tucker conditions. This allowed all the models above to be compared in terms of performance, risk-adjusted returns, concentration, and weighted average ratings.
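Under the simplifying assumption of uncorrelated exposures, risk parity has a closed form: weight each sovereign inversely to its risk proxy so that every position contributes equal risk. The chapter solves the general correlated problem numerically (GRG with KKT conditions); the sketch below shows only this uncorrelated special case, with made-up CDS spreads.

```python
def risk_parity_weights(cds_spreads):
    """Naive risk parity with CDS spreads as the credit-risk proxy:
    weights inversely proportional to the spread, so each position's
    weight-times-risk product is identical (uncorrelated case only)."""
    inv = [1.0 / s for s in cds_spreads]
    total = sum(inv)
    return [x / total for x in inv]

# spreads in basis points for three hypothetical sovereigns
w = risk_parity_weights([50.0, 100.0, 200.0])
# each product spread * weight is equal across the three positions
```

In the correlated case the risk contributions involve the full covariance matrix and the weights must be found numerically, which is where the GRG machinery in the chapter comes in.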

Findings

Analyzing the whole period between 2006 and 2014, it was found that both fundamental models gave very appealing results in terms of risk-adjusted returns. The best results were returned by the Fundamental-Ranking RP model, followed by the Fundamental-Weighting RP model. However, results for performance and risk-adjusted returns were more mixed on a yearly basis and when the whole period was subdivided into three equal sub-periods. Moreover, the authors concluded that over the long term, fundamental bond indexing triumphed over the other approaches by offering superior return and risk characteristics. Fundamental indexation can thus be used as an alternative to other traditional models.

Details

Contemporary Issues in Bank Financial Management
Type: Book
ISBN: 978-1-78635-000-8

Article
Publication date: 5 October 2012

Sangwon Park and Daniel R. Fesenmaier

Abstract

Purpose

The purpose of this study is to estimate the extent (mean and range) of non‐response bias in online travel advertising conversion studies for 24 destinations located throughout the USA.

Design/methodology/approach

The method uses two weighting procedures (i.e. post-stratification and propensity score weighting) to estimate the extent of non-response bias by adjusting the respondents' estimates to more closely represent the total target sample.
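The first procedure, post-stratification, reduces to a ratio of population share to sample share per stratum. The sketch below is a generic illustration with invented strata and counts, not the study's actual weighting cells or survey data.

```python
def post_stratification_weights(sample_counts, population_shares):
    """Post-stratification: each respondent in stratum s gets weight
    (population share of s) / (sample share of s), so the weighted
    sample matches the known target margins."""
    n = sum(sample_counts.values())
    return {s: population_shares[s] / (sample_counts[s] / n)
            for s in sample_counts}

w = post_stratification_weights(
    {"visited": 60, "not_visited": 40},      # respondents per stratum
    {"visited": 0.30, "not_visited": 0.70},  # known shares in the target sample
)
# the over-represented "visited" stratum is down-weighted (0.5),
# the under-represented one up-weighted (1.75)
```

Propensity score weighting follows the same inverse-probability logic, except the weight is 1 over an estimated response probability rather than a known population share.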

Findings

The results of this analysis clearly indicate that using unweighted data to estimate advertising effectiveness may lead to substantial overestimation of conversion rates, though there is limited "bias" in the estimates of median visitor expenditures. The analyses also indicate that the weighting systems have substantially different impacts on the estimates of conversion rates.

Research limitations/implications

First, the likelihood of answering a survey varies substantially with the degree of familiarity with the survey mode (i.e. paper or telephone versus internet). Second, competition-related variables (i.e. the number and competitiveness of alternative nearby destinations) and various aspects of the campaign (i.e. the amount of investment in a location) should be considered.

Originality/value

This study of 24 different American tourism campaigns provides a useful understanding of the nature (mean and range) of the impact of non-response bias in tourism advertising conversion studies. Additionally, where obtaining a reference survey is difficult in an advertising study, the two weighting methods used here are shown to be useful for assessing errors in response data, especially propensity score weighting, where the means of developing multivariate-based weights is straightforward.

Details

International Journal of Culture, Tourism and Hospitality Research, vol. 6 no. 4
Type: Research Article
ISSN: 1750-6182

Article
Publication date: 22 September 2021

Fatemeh Chahkotahi and Mehdi Khashei

Abstract

Purpose

Improving the accuracy and reducing the computational costs of predictions, especially time series predictions, is one of the most critical parts of decision-making and management processes in different areas and organizations. One of the best ways to achieve high accuracy and low computational cost in time series forecasting is to develop and use efficient hybrid methods. Among combined methods, parallel hybrid approaches are more welcomed by scholars and often perform better than sequential ones. However, a necessary condition for using parallel combination approaches is estimating appropriate component weights. This weighting stage of parallel hybrid models is the most influential factor in both forecasting accuracy and computational cost. In the literature, meta-heuristic algorithms have often been applied to weight the components of parallel hybrid models. However, such algorithms, despite their unique advantages, have two serious disadvantages: local optima and iterative, time-consuming optimization processes. The purpose of this paper is to develop a linear optimal weighting estimator (LOWE) algorithm for finding the desired component weights in a global, non-iterative and universal manner.

Design/methodology/approach

In this paper, a LOWE algorithm is developed to find the desired component weights in a global, non-iterative and universal manner.
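One generic way to obtain combination weights in closed form, in the same global, non-iterative spirit as LOWE, is ordinary least squares on the component forecasts. The sketch below is this standard OLS device, not the authors' exact estimator, and the series are invented.

```python
import numpy as np

def combination_weights(component_forecasts, actual):
    """Closed-form (non-iterative) combination weights: stack the
    component forecasts as columns of X and solve min_w ||X w - y||^2
    via least squares; no iterative meta-heuristic search is needed."""
    X = np.column_stack(component_forecasts)   # one column per component
    w, *_ = np.linalg.lstsq(X, np.asarray(actual), rcond=None)
    return w

y = np.array([1.0, 2.0, 3.0, 4.0])     # actual load
f1 = np.array([1.1, 2.1, 3.1, 4.1])    # component biased high
f2 = np.array([0.9, 1.9, 2.9, 3.9])    # component biased low
w = combination_weights([f1, f2], y)    # near [0.5, 0.5]: biases cancel
```

Because the solution comes from a single linear solve, its cost is fixed and tiny compared with running a GA, PSO or DE population for many generations.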

Findings

Empirical results indicate that the accuracy of the LOWE-based parallel hybrid model is significantly better than that of meta-heuristic and simple average (SA) based models. The proposed weighting approach can improve the performance of the differential evolution (DE), genetic algorithm (GA), particle swarm optimization (PSO) and SA-based parallel hybrid models in electricity load forecasting by 13.96%, 11.64%, 9.35% and 25.05%, respectively, while its computational costs are considerably lower than those of the GA-, PSO- and DE-based parallel hybrid models. It can therefore be considered an appropriate and effective alternative weighting technique for efficient parallel hybridization in time series forecasting.

Originality/value

In this paper, a LOWE algorithm is developed to find the desired component weights in a global, non-iterative and universal manner. Although it can be demonstrated in general that the performance of the proposed weighting technique will be no worse than that of meta-heuristic algorithms, its performance is also evaluated in practice on real-world data sets.

Article
Publication date: 9 December 2020

Fatma Pakdil, Pelin Toktaş and Gülin Feryal Can

Abstract

Purpose

The purpose of this study is to develop a methodology in which alternate Six Sigma projects are prioritized and selected using appropriate multi-criteria decision-making (MCDM) methods in healthcare organizations. This study addresses a particular gap in implementing a systematic methodology for Six Sigma project prioritization and selection in the healthcare industry.

Design/methodology/approach

This study develops a methodology in which alternate Six Sigma projects are prioritized and selected using a modified Kemeny median indicator rank accordance (KEMIRA-M), an MCDM method, based on a case study in healthcare organizations. The case study was hypothetically developed and is presented to demonstrate the proposed framework's applicability and validity for future decision-makers involved in Six Sigma project selection processes.

Findings

The study reveals that Six Sigma project prioritization with KEMIRA-M assigns the highest ranks to the patient satisfaction, revenue enhancement and sigma level benefit criteria, while resource utilization and process cycle time receive the lowest ranks.

Practical implications

The methodology developed in this paper offers an MCDM-based approach for practitioners to prioritize and select Six Sigma projects in the healthcare industry. The findings regarding patient satisfaction and revenue enhancement mesh with the current trends that dominate and regulate the industry. KEMIRA-M provides flexibility for Six Sigma project selection and uses multiple criteria in two criteria groups simultaneously. In this study, a more objective KEMIRA-M method is suggested by implementing two different ranking-based weighting approaches.

Originality/value

This is the first study to implement KEMIRA-M in the Six Sigma project prioritization and selection process in the healthcare industry. To overcome shortcomings of earlier KEMIRA-M applications, the criteria weighting procedure of KEMIRA-M was formed from two different ranking-based weighting methods. The study provides decision-makers with a methodology that considers both benefit- and cost-type criteria for the alternatives, and that gives weight both to experts' rankings of the criteria and to the alternatives' performance values on those criteria.
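Two standard ranking-based weighting schemes of the kind referred to here are rank-sum and rank-reciprocal weights. The sketch below shows both, assuming an expert has ranked the criteria from most to least important; it does not claim to reproduce the paper's exact procedure.

```python
def rank_sum_weights(ranks):
    """Rank-sum weights: a criterion ranked r (1 = most important) out
    of n gets weight (n - r + 1) / sum_k (n - k + 1)."""
    n = len(ranks)
    raw = [n - r + 1 for r in ranks]
    s = sum(raw)
    return [x / s for x in raw]

def rank_reciprocal_weights(ranks):
    """Rank-reciprocal weights: w_r = (1/r) / sum_k (1/k); decays
    faster, concentrating weight on the top-ranked criteria."""
    inv = [1.0 / r for r in ranks]
    s = sum(inv)
    return [x / s for x in inv]

# three criteria ranked 1, 2, 3 by an expert
w_sum = rank_sum_weights([1, 2, 3])          # [0.5, 1/3, 1/6]
w_rec = rank_reciprocal_weights([1, 2, 3])   # roughly [0.55, 0.27, 0.18]
```

Either scheme converts ordinal expert judgments into normalized numeric weights without asking experts for exact importance values, which is the appeal of ranking-based weighting in KEMIRA-style methods.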

Details

International Journal of Lean Six Sigma, vol. 12 no. 3
Type: Research Article
ISSN: 2040-4166

Article
Publication date: 1 March 2005

Andreas Jobst

Abstract

This paper provides a comprehensive overview of the gradual evolution of the supervisory policy adopted by the Basel Committee for the regulatory treatment of asset securitisation. The pathology of the new "securitisation framework" is carefully highlighted to facilitate a general understanding of what constitutes the current state of computing adequate capital requirements for securitised credit exposures. Although a simplified sensitivity analysis of the varying levels of capital charges depending on the security design of asset securitisation transactions is incorporated, the author does not engage in a profound analysis of the benefits and drawbacks entailed by the new securitisation framework.

Details

Journal of Financial Regulation and Compliance, vol. 13 no. 1
Type: Research Article
ISSN: 1358-1988

Article
Publication date: 18 May 2021

Güler Aras and Filiz Mutlu Yıldırım

Abstract

Purpose

In integrated reporting, financial and non-financial performance is presented in an interrelated way, and since corporations' value creation abilities are framed in terms of capitals, the importance of the topic increases day by day. In addition, because the importance of the basic dimensions and sub-dimensions representing the capitals differs between institutions, the question arises of what weights they should receive. From this starting point, this paper aims to develop the capitals in integrated reporting and to weight the indicators representing them.

Design/methodology/approach

In this study, first, to ensure that each component of capital is included in integrated reporting, governance capital is added to the capitals identified in the international integrated reporting framework (the Framework). Then, the weights of each capital dimension, and of the indicators within these dimensions, are determined for a banking sector example using the entropy method.
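The entropy method itself is short enough to sketch: normalise each criterion's column, compute its Shannon entropy, and weight criteria by their divergence 1 - e_j, so criteria whose values differ more across alternatives get more weight. The data below are invented for illustration, not taken from the study's banking sample.

```python
import math

def entropy_weights(matrix):
    """Shannon-entropy criteria weighting over a decision matrix whose
    rows are alternatives and columns are criteria (all values > 0)."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)                 # normalises entropy to [0, 1]
    divergence = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [x / total for x in col]      # column normalisation
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        divergence.append(1.0 - e)        # degree of divergence
    s = sum(divergence)
    return [d / s for d in divergence]

# rows = banks (alternatives), columns = capital indicators (criteria)
w = entropy_weights([[7, 1], [7, 5], [7, 9]])
# column 0 is identical across banks (zero divergence), so essentially
# all the weight goes to column 1
```

This is why the method is attractive for integrated-reporting indicators: the weights come from the dispersion of the disclosed data itself rather than from subjective judgment.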

Findings

Covering the 2014-2017 period, an efficient weight assessment approach based on the entropy method is presented, and the most heavily weighted element is found to be intellectual capital.

Research limitations/implications

The limitations of this study are the lack of an agreed general framework of indicators representing the multiple capitals in integrated reporting, each bank's disclosure of different indicators, and differences in the shared data between sources.

Practical implications

This study guides the weighting studies necessary for integrated performance measurement.

Social implications

It is foreseen that this study will be effective in the development of integrated thinking and this effect will contribute to the overall functioning of all sectors beyond the banking sector, which is the application area of the study.

Originality/value

The study is the first in the literature to provide a new dimension by adding governance capital to the capitals defined in the Framework.

Details

Social Responsibility Journal, vol. 18 no. 3
Type: Research Article
ISSN: 1747-1117

Article
Publication date: 31 December 2019

Yoke Yie Chen, Nirmalie Wiratunga and Robert Lothian

Abstract

Purpose

Recommender system approaches such as collaborative and content-based filtering rely on user ratings and product descriptions to recommend products. More recently, recommender system research has focussed on exploiting knowledge from user-generated content such as product reviews to enhance recommendation performance. The purpose of this paper is to show that the performance of a recommender system can be enhanced by integrating explicit knowledge extracted from product reviews with implicit knowledge extracted from analysis of consumer’s purchase behaviour.

Design/methodology/approach

The authors introduce a sentiment and preference-guided strategy for product recommendation by integrating not only explicit, user-generated and sentiment-rich content but also implicit knowledge gleaned from users’ product purchase preferences. Integration of both of these knowledge sources helps to model sentiment over a set of product aspects. The authors show how established dimensionality reduction and feature weighting approaches from text classification can be adopted to weight and select an optimal subset of aspects for recommendation tasks. The authors compare the proposed approach against several baseline methods as well as the state-of-the-art better method, which recommends products that are superior to a query product.
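A typical feature-weighting statistic adopted from text classification for this kind of aspect weighting and selection is chi-square over a 2x2 contingency table of aspect presence versus sentiment class. The sketch below is a generic example with invented counts, not the authors' exact pipeline.

```python
def chi_square(n11, n10, n01, n00):
    """Chi-square association between an aspect and the positive class:
    n11 = positive reviews mentioning the aspect, n10 = negative reviews
    mentioning it, n01 / n00 = positive / negative reviews without it."""
    n = n11 + n10 + n01 + n00
    num = n * (n11 * n00 - n10 * n01) ** 2
    den = (n11 + n01) * (n11 + n10) * (n10 + n00) * (n01 + n00)
    return num / den if den else 0.0

def top_aspects(stats, k):
    """Keep the k aspects with the highest chi-square weight."""
    return sorted(stats, key=lambda a: chi_square(*stats[a]), reverse=True)[:k]

# invented counts for two product aspects
stats = {"battery": (30, 10, 5, 55), "colour": (20, 20, 20, 40)}
best = top_aspects(stats, 1)  # "battery" associates more strongly with sentiment
```

Selecting only the highest-scoring aspects discards those whose mentions carry little sentiment signal, which is the dimensionality-reduction step the abstract refers to.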

Findings

Evaluation results from seven different product categories show that aspect weighting and selection significantly improves state-of-the-art recommendation approaches.

Research limitations/implications

The proposed approach recommends products by analysing user sentiment on product aspects. Therefore, the proposed approach can be used to develop recommender systems that can explain to users why a product is recommended. This is achieved by presenting an analysis of sentiment distribution over individual aspects that describe a given product.

Originality/value

This paper describes a novel approach to integrate consumer purchase behaviour analysis and aspect-level sentiment analysis to enhance recommendation. In particular, the authors introduce the idea of aspect weighting and selection to help users identify better products. Furthermore, the authors demonstrate the practical benefits of this approach on a variety of product categories and compare the approach with the current state-of-the-art approaches.

Details

Online Information Review, vol. 44 no. 2
Type: Research Article
ISSN: 1468-4527

Article
Publication date: 1 April 1996

Alexander M. Robertson and Peter Willett

Abstract

This paper describes the development of a genetic algorithm (GA) for the assignment of weights to query terms in a ranked‐output document retrieval system. The GA involves a fitness function that is based on full relevance information, and the rankings resulting from the use of these weights are compared with the Robertson‐Sparck Jones F4 retrospective relevance weight. Extended experiments with seven document test collections show that the GA can often find weights that are slightly superior to those produced by the deterministic weighting scheme. That said, there are many cases where the two approaches give the same results, and a few cases where the F4 weights are superior to the GA weights. Since the GA has been designed to identify weights yielding the best possible level of retrospective performance, these results indicate that the F4 weights provide an excellent and practicable alternative. Evidence is presented to suggest that negative weights may play an important role in retrospective relevance weighting.
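The F4 weight that the GA is benchmarked against has a standard closed form, shown below with the usual point-5 correction; the example inputs are invented. Note that the formula naturally produces negative weights for terms under-represented in the relevant set, which connects to the paper's closing observation.

```python
import math

def f4_weight(r, n, R, N):
    """Robertson-Sparck Jones F4 retrospective relevance weight for a
    query term: r = relevant docs containing the term, n = docs
    containing the term, R = relevant docs retrieved, N = collection
    size. The 0.5 terms are the standard point-5 correction."""
    return math.log(((r + 0.5) * (N - n - R + r + 0.5)) /
                    ((n - r + 0.5) * (R - r + 0.5)))

# a term concentrated in the relevant set gets a positive weight
w_pos = f4_weight(r=8, n=20, R=10, N=1000)
# a common term absent from the relevant set gets a negative weight
w_neg = f4_weight(r=0, n=100, R=10, N=1000)
```

Because F4 is a single deterministic formula, it needs no search at all, which is why matching it with a GA is a meaningful baseline comparison.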

Details

Journal of Documentation, vol. 52 no. 4
Type: Research Article
ISSN: 0022-0418
