Search results
1–10 of over 135,000
Alexander Binun, Bracha Shapira and Yuval Elovici
Abstract
Purpose
The purpose of this paper is to present an extension to a framework based on the information structure (IS) model for combining information filtering (IF) results. The main goal of the framework is to combine the results of the different IF systems so as to maximise the expected payoff (EP) to the user. In this paper we compare three different approaches to tuning the relevance thresholds of individual IF systems that are being combined in order to maximise the EP to the user. In the first approach we set the same threshold for each of the IF systems. In the second approach the threshold of each IF system is tuned independently to maximise its own EP (“local optimisation”). In the third approach the thresholds of the IF systems are jointly tuned to maximise the EP of the combined system (“global optimisation”).
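The local versus global tuning regimes can be sketched with a toy model. Everything below is an illustrative assumption rather than the paper's IS framework: a +1/-0.5 payoff for delivering a relevant/non-relevant document, a conjunctive combination rule (deliver only when every system's score clears its threshold), and exhaustive grid search over thresholds.

```python
import itertools

# Illustrative payoff model (not the paper's IS model): +1 for delivering
# a relevant document, -0.5 for delivering a non-relevant one.
GAIN, LOSS = 1.0, -0.5

def expected_payoff(scores, relevant, thresholds):
    """Payoff of the combined filter: deliver a document only when
    every component system scores it above its own threshold."""
    total = 0.0
    for doc_scores, rel in zip(scores, relevant):
        if all(s >= t for s, t in zip(doc_scores, thresholds)):
            total += GAIN if rel else LOSS
    return total

def local_optimisation(scores, relevant, grid):
    """Second approach: tune each system's threshold on its own payoff."""
    thresholds = []
    for i in range(len(scores[0])):
        def payoff(t, i=i):
            return sum((GAIN if rel else LOSS)
                       for ds, rel in zip(scores, relevant) if ds[i] >= t)
        thresholds.append(max(grid, key=payoff))
    return tuple(thresholds)

def global_optimisation(scores, relevant, grid):
    """Third approach: jointly tune all thresholds on the combined payoff."""
    n = len(scores[0])
    return max(itertools.product(grid, repeat=n),
               key=lambda ts: expected_payoff(scores, relevant, ts))

# Toy stream: (system-1 score, system-2 score) per document + relevance.
scores = [(0.9, 0.8), (0.7, 0.2), (0.3, 0.9), (0.2, 0.1)]
relevant = [True, False, False, False]
grid = [0.1, 0.3, 0.5, 0.7, 0.9]
print(global_optimisation(scores, relevant, grid))
```

On this toy stream the jointly tuned thresholds achieve the maximal payoff of 1.0 (only the relevant document is delivered), mirroring the finding that global optimisation cannot do worse than local tuning.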
Design/methodology/approach
An empirical evaluation is conducted to examine the performance of each approach using two IF systems based on somewhat different filtering algorithms (TFIDF, OKAPI). Experiments are run using the TREC3, TREC6, and TREC7 test collections.
Findings
The experiments reveal that, as expected, the third approach always outperforms the first and the second, and that for some user profiles, the difference is significant. However, operational goals argue against global optimisation, and the costs of meeting these operational goals are discussed.
Research limitations/implications
One limitation is the assumption of independence of the IF systems: real-life systems usually use similar algorithms, so dependency might occur. The approach should therefore also be examined under the assumption of dependency between systems.
Practical implications
The main practical implications of this study lie in the empirical proof that combination of filtering systems improves filtering results and the finding about the optimal combination methods for the different user profiles. Many filtering applications exist (e.g. spam filters, news personalisation systems, etc.) that can benefit from these findings.
Originality/value
The study presents and compares the contribution of three different combination methods of filtering systems to the improvement of filtering results. It empirically shows the benefits of each method and draws important conclusions about the combination of filtering systems.
Mandeep Kaur Sidhu, Kanwarpreet Singh and Doordarshi Singh
Abstract
Purpose
The purpose of this paper is to evaluate the capabilities of total quality management (TQM) and supply chain management (SCM) and extract various significant factors which influence the implementation of SCM alone and synergy of both TQM–SCM in terms of business performance of Indian medium and large scale manufacturing industry.
Design/methodology/approach
In the present study, 116 Indian manufacturing organizations have been extensively surveyed to ascertain the inter-relationships between various success factors and competitive dimensions of SCM alone and for combined approach (TQM–SCM), through different statistical techniques. Further, to evaluate the significance of time period on competitive dimensions, two-tailed t-test has been deployed. Finally the discriminant validity test has been applied to extract highly successful and moderately successful organizations for both approaches.
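A two-tailed comparison of means of the kind deployed here can be illustrated with Welch's t statistic. The group labels and scores below are invented for illustration; the paper's actual survey data is not reproduced.

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic for a two-tailed comparison of
    group means (unequal variances allowed)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical competitive-dimension scores in the introduction phase
# versus the maturity phase of TQM-SCM adoption (illustrative numbers).
introduction = [3.1, 3.4, 2.9, 3.2, 3.0]
maturity = [4.0, 4.3, 3.8, 4.1, 4.2]
print(round(welch_t(maturity, introduction), 3))   # 7.891
```

A statistic this large would be compared against the t distribution's two-tailed critical value for the relevant degrees of freedom to judge whether the phase effect is significant.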
Findings
The study compares the contributions played by only SCM initiatives and combined approach (TQM–SCM) initiatives toward realization of significant improvements of various competitive dimensions of Indian manufacturing organizations. Finally, this study reveals that synergistic relationship of TQM and SCM paradigms can be more helpful as compared to only SCM initiatives for Indian manufacturing industries to enhance overall business performance.
Originality/value
TQM and SCM are considered performance improvement techniques by manufacturing organizations. The present research establishes that combined (TQM–SCM) initiatives have contributed effectively to the realization of significant competitive dimensions, progressively from the introduction to the maturity phase. The study therefore stresses the need to improve coordination between various manufacturing parameters, as well as the competitive dimensions of the TQM and SCM paradigms, to realize the higher business-performance potential.
Abstract
Purpose
The purpose of this paper is to investigate how to implement a combined assurance program.
Design/methodology/approach
This paper uses qualitative data obtained through semi-structured interviews with six multinationals at different stages of combined assurance implementation maturity.
Findings
The paper finds that organizations are still learning through combined assurance implementation because no organization seems to have attained a mature combined assurance program. Nevertheless, our descriptive findings reveal that a successful combined assurance implementation rests on six important components.
Research limitations/implications
One limitation of this study is that, as the organizations studied are at different stages of combined assurance program implementation, data may have comparability issues. Another limitation is that different interviewees were studied from one case to another.
Practical implications
The results have implications both for organizations that do not yet have a combined assurance program in place and for those currently at the implementation stage. It has also implications for chief audit executives who are good candidates to lead a combined assurance implementation and for regulators, as the study describes combined assurance as an important accountability mechanism that helps boards and audit committees exercise their oversight role properly.
Originality/value
The study is the first to address combined assurance implementation. It complements the study of the Institute of Internal Auditors UK and Ireland (2010), which identifies the reasons for failed attempts to coordinate assurance activities, by illustrating combined assurance implementation through six international case studies of organizations at different combined assurance implementation stages.
TR Sreeram and Asokan Thondiyath
Abstract
Purpose
The purpose of this paper is to present a combined framework for system design using Six Sigma and Lean concepts. Systems Engineering has evolved independently and there are numerous tools and techniques available to address issues that may arise in the design of systems. In the context of systems design, the application of Six Sigma and Lean concepts results in a flexible and adaptable framework. A combined framework is presented here that allows better visualization of the system-level components and their interactions at the parametric level, and it also illuminates gaps that make way for continuous improvement. Deming's Plan-Do-Check-Act cycle is the basis of this framework. Three case studies are presented to evaluate the application of this framework in the context of Systems Engineering design. The paper concludes with a summary of the advantages of using a combined framework, its limitations and scope for future work.
Design/methodology/approach
Six Sigma, Lean and Systems Engineering approaches are combined into a framework for collaborative product development.
Findings
The present framework is not rigid and does not attempt to force fit any tools or concepts. The framework is generic and allows flexibility through a plug and play type of implementation. This is important, as engineering change needs vary constantly to meet consumer demands. Therefore, it is important to engrain flexibility in the development of a foundational framework for design-encapsulating improvements and innovation. From a sustainability perspective, it is important to develop techniques that drive rationality in the decisions, especially during tradeoffs and conflicts.
Research limitations/implications
The approach's scalability to large systems with complex interactions remains open. Moreover, the application of negotiation techniques to more than three parties poses a challenge from a mathematical context. Future research should address these issues in the context of systems design using Six Sigma and Lean techniques.
Practical implications
This paper provides a flexible framework for combining the three techniques based on Six Sigma, Lean and Systems Engineering.
Social implications
This paper will influence the construction of agent-based systems, particularly the ones using the Habermas’s theory of social action as the basis for product development.
Originality/value
This paper has not been published in any other journal or conference.
Gebeyehu Belay Gebremeskel, Chai Yi, Zhongshi He and Dawit Haile
Abstract
Purpose
Among the growing number of data mining (DM) techniques, outlier detection has gained importance in many applications and has attracted much attention in recent times. In the past, outlier-detection research in safety care could be viewed as searching for needles in a haystack. However, outliers are not always erroneous. Therefore, the purpose of this paper is to investigate the role of outliers in healthcare services in general and patient safety care in particular.
Design/methodology/approach
The paper proposes a combined DM technique (clustering and nearest neighbour) for outlier detection, which provides a clear understanding and meaningful insights for visualizing data behaviour in healthcare safety. The implicit knowledge uncovered is vitally important to a proper clinical decision-making process. The method captures the semantics of patients' events and situations, which play a significant role in patient safety care and medication.
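The clustering-plus-nearest-neighbour combination can be sketched roughly as follows. The distance threshold, the choice of k and the vitals-style readings are all invented for illustration, not taken from the paper.

```python
import math

def knn_outlier_scores(points, k=2):
    """Score each point by the distance to its k-th nearest neighbour:
    isolated points receive large scores."""
    scores = []
    for i, p in enumerate(points):
        dists = sorted(math.dist(p, q)
                       for j, q in enumerate(points) if j != i)
        scores.append(dists[k - 1])
    return scores

def single_linkage_clusters(points, eps):
    """Greedy single-linkage clustering: a point joins the first cluster
    containing a member within `eps`; outliers end up in tiny clusters."""
    clusters = []
    for p in points:
        for c in clusters:
            if any(math.dist(p, q) <= eps for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

# Toy rescaled vitals-like readings (e.g. heart rate, systolic BP).
readings = [(0.70, 0.62), (0.72, 0.60), (0.68, 0.61),
            (0.71, 0.63), (0.95, 0.20)]
scores = knn_outlier_scores(readings, k=2)
outlier = readings[max(range(len(scores)), key=scores.__getitem__)]
print(outlier)   # the isolated reading
```

Combining the two views, a reading that both sits in a singleton cluster and has a large k-NN distance is a strong outlier candidate, which is the spirit of the integrated detection described above.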
Findings
The outcome of the paper is a novel, integrated methodology that can be applied to the analysis of different biological data. Integrated DM techniques are discussed to optimize performance in the field of health and medical science. The integrated outlier-detection method can be extended to search for valuable information and implicit knowledge based on selected patient factors. On this basis, outliers are detected as clusters and point events, and novel ideas are proposed to empower clinical services with customers' satisfaction in mind. The work can also serve as a baseline for further healthcare strategic development and research.
Research limitations/implications
This paper focusses mainly on outlier detection. Outlier isolation, which is essential for investigating why an outlier occurred and for communicating how to mitigate it, is not addressed. The research can therefore be extended to the hierarchy of patient problems.
Originality/value
DM is a dynamic and successful gateway for discovering useful knowledge to enhance healthcare performance and patient safety. Clinical-data-based outlier detection is a basic task in achieving a healthcare strategy. In this paper, therefore, the authors focus on combined DM techniques for a deep analysis of clinical data, supporting an optimal level of clinical decision making. Proper clinical decisions can be obtained through attribute selection, which identifies the influential factors or parameters of healthcare services. Using integrated clustering and nearest-neighbour techniques therefore gives more acceptable results when searching such complex data for outliers, which could be fundamental to further situational analysis of healthcare and patient safety.
Chedia Dhaoui, Cynthia M. Webster and Lay Peng Tan
Abstract
Purpose
With the soaring volumes of brand-related social media conversations, digital marketers have extensive opportunities to track and analyse consumers’ feelings and opinions about brands, products or services embedded within consumer-generated content (CGC). These “Big Data” opportunities render manual approaches to sentiment analysis impractical and raise the need to develop automated tools to analyse consumer sentiment expressed in text format. This paper aims to evaluate and compare the performance of two prominent approaches to automated sentiment analysis applied to CGC on social media and explores the benefits of combining them.
Design/methodology/approach
A sample of 850 consumer comments from 83 Facebook brand pages are used to test and compare lexicon-based and machine learning approaches to sentiment analysis, as well as their combination, using the LIWC2015 lexicon and RTextTools machine learning package.
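One simple way to combine a lexicon score with a learned classifier, in the spirit of (but far cruder than) the LIWC2015 + RTextTools setup described above. The tiny word lists and the unigram-count "classifier" are stand-ins, not the actual tools.

```python
# Toy stand-ins for the LIWC2015 lexicon and the RTextTools learners.
POS_WORDS = {"love", "great", "awesome", "good"}
NEG_WORDS = {"hate", "awful", "bad", "terrible"}

def lexicon_score(text):
    """Lexicon-based signal: positive minus negative word hits."""
    words = text.lower().split()
    return (sum(w in POS_WORDS for w in words)
            - sum(w in NEG_WORDS for w in words))

def train_unigram(labelled):
    """'Learn' per-class unigram counts from labelled comments."""
    counts = {"positive": {}, "negative": {}}
    for text, label in labelled:
        for w in text.lower().split():
            counts[label][w] = counts[label].get(w, 0) + 1
    return counts

def ml_margin(counts, text):
    """Learned signal: positive-class minus negative-class evidence."""
    words = text.lower().split()
    return (sum(counts["positive"].get(w, 0) for w in words)
            - sum(counts["negative"].get(w, 0) for w in words))

def combined_label(counts, text):
    """Combine the two signed signals by summing them, so each approach
    can compensate when the other is neutral or wrong."""
    total = lexicon_score(text) + ml_margin(counts, text)
    return "positive" if total >= 0 else "negative"

train = [("love this great phone", "positive"),
         ("awful battery hate it", "negative")]
counts = train_unigram(train)
print(combined_label(counts, "great camera love it"))   # positive
```

Summing signed scores is only one possible combination policy; voting or confidence-weighted schemes are equally plausible readings of "combining" the two approaches.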
Findings
Results show the two approaches are similar in accuracy, both achieving higher accuracy when classifying positive sentiment than negative sentiment. However, they differ substantially in their classification ensembles. The combined approach demonstrates significantly improved performance in classifying positive sentiment.
Research limitations/implications
Further research is required to improve the accuracy of negative sentiment classification. The combined approach needs to be applied to other kinds of CGCs on social media such as tweets.
Practical implications
The findings inform decision-making around which sentiment analysis approaches (or a combination thereof) is best to analyse CGC on social media.
Originality/value
This study combines two sentiment analysis approaches and demonstrates significantly improved performance.
Brent Wenerstrom and Mehmed Kantardzic
Abstract
Purpose
Search engine users are faced with long lists of search results, each entry being of a varying degree of relevance. Often the short text of a search result gives users false expectations about the linked web page. This leads users to skip relevant information, missing valuable insights, and to click on irrelevant web pages, wasting time. The purpose of this paper is to propose a new summary generation technique, ReClose, which combines query-independent and query-biased summary techniques to improve the accuracy of users' expectations.
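A rough sketch of blending a query-independent signal with a query-biased one. The position-based weighting and overlap scoring below are assumptions for illustration, not ReClose's actual formula.

```python
def summarise(document, query, max_sentences=2):
    """Blend a query-independent signal (sentence position) with a
    query-biased signal (query-term overlap); the 0.5 position weight
    is an illustrative choice."""
    sentences = [s.strip() for s in document.split(".") if s.strip()]
    q_terms = set(query.lower().split())

    def score(item):
        idx, sent = item
        position = 1.0 / (idx + 1)           # earlier sentences rank higher
        overlap = len(q_terms & set(sent.lower().split()))
        return overlap + 0.5 * position

    ranked = sorted(enumerate(sentences), key=score, reverse=True)
    chosen = sorted(ranked[:max_sentences])  # restore document order
    return ". ".join(s for _, s in chosen) + "."

doc = ("Pythons are large snakes. They are found in Africa and Asia. "
       "Their diet consists of small mammals. Some exceed six metres.")
print(summarise(doc, "python diet"))
```

The blend keeps the lead (context-setting) sentence while still surfacing the sentence that answers the query, which is the intuition behind combining the two summary techniques.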
Design/methodology/approach
The authors tested the effectiveness of ReClose summaries against Google summaries by surveying 34 participants. Participants were randomly assigned to use one type of summary approach. Summary effectiveness was judged based on the accuracy of each user's expectations.
Findings
It was found that individuals using ReClose summaries showed a 10 per cent increase in expectation accuracy over individuals using Google summaries, indicating better user satisfaction.
Practical implications
The survey demonstrates the effectiveness of using ReClose summaries to improve the accuracy of user expectations.
Originality/value
This paper presents a novel summary generation technique called ReClose, a new approach to summary evaluation and improvements upon previously proposed summary generation techniques.
Abstract
Purpose
This paper seeks to examine image retrieval within two different contexts: a monolingual context where the language of the query is the same as the indexing language and a multilingual context where the language of the query is different from the indexing language. The study also aims to compare two different approaches for the indexing of ordinary images representing common objects: traditional image indexing with the use of a controlled vocabulary and free image indexing using uncontrolled vocabulary.
Design/methodology/approach
This research uses three data collection methods. An analysis of the indexing terms was employed in order to examine the multiplicity of term types assigned to images. A simulation of the retrieval process involving a set of 30 images was performed with 60 participants. The quantification of the retrieval performance of each indexing approach was based on the usability measures, that is, effectiveness, efficiency and satisfaction of the user. Finally, a questionnaire was used to gather information on searcher satisfaction during and after the retrieval process.
Findings
The results of this research are twofold. The analysis of indexing terms associated with all the 3,950 images provides a comprehensive description of the characteristics of the four non‐combined indexing forms used for the study. Also, the retrieval simulation results offer information about the relative performance of the six indexing forms (combined and non‐combined) in terms of their effectiveness, efficiency (temporal and human) and the image searcher's satisfaction.
Originality/value
The findings of the study suggest that, in the near future, the information systems could benefit from allowing an increased coexistence of controlled vocabularies and uncontrolled vocabularies, resulting from collaborative image tagging, for example, and giving the users the possibility to dynamically participate in the image‐indexing process, in a more user‐centred way.
Merlin Sajini M.L., Suja S. and Merlin Gilbert Raj S.
Abstract
Purpose
The purpose of the study is distributed generation (DG) planning in a radial distribution network: identifying appropriate locations and suitable ratings for DG units energized by renewable energy resources so as to reduce power loss and improve voltage levels. Although several algorithms have already been proposed with the aims of power-loss reduction and voltage-stability enhancement, the objectives are further improved here by combining heuristic algorithms, namely differential evolution (DE) and particle swarm optimization (PSO).
Design/methodology/approach
The candidate buses for the location of DG units and the optimal ratings of the DG units are identified by a combined differential evolution (DE) and PSO algorithm. The combined strategy brings together the key merits of both algorithms: DE prevents individuals from getting trapped in local optima, providing efficient global optimization, while PSO provides a fast convergence rate by steering the swarm toward the best particle found so far.
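The division of labour just described can be sketched by alternating one DE generation with one PSO update on a toy objective. The quadratic objective, parameter settings and simplified PSO rule (no personal-best memory) are illustrative assumptions, not the paper's formulation of the loss/voltage problem.

```python
import random

def loss(x):
    """Toy quadratic objective standing in for feeder power loss."""
    return sum(v * v for v in x)

def de_step(pop, f=0.8, cr=0.9):
    """One DE/rand/1 generation: mutate, crossover, keep the better vector
    (greedy selection means DE never worsens an individual)."""
    new_pop = []
    for i, x in enumerate(pop):
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        trial = [ai + f * (bi - ci) if random.random() < cr else xi
                 for xi, ai, bi, ci in zip(x, a, b, c)]
        new_pop.append(trial if loss(trial) < loss(x) else x)
    return new_pop

def pso_step(pop, vel, gbest, w=0.5, c2=1.5):
    """One simplified PSO update: inertia plus attraction toward the
    current global best (personal-best term omitted for brevity)."""
    for x, v in zip(pop, vel):
        for d in range(len(x)):
            v[d] = w * v[d] + c2 * random.random() * (gbest[d] - x[d])
            x[d] += v[d]

random.seed(0)
pop = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(12)]
vel = [[0.0, 0.0] for _ in range(12)]
init_val = best_val = min(loss(p) for p in pop)
for _ in range(60):
    pop = de_step(pop)                     # DE: global exploration
    gbest = list(min(pop, key=loss))
    best_val = min(best_val, loss(gbest))
    pso_step(pop, vel, gbest)              # PSO: fast convergence
    best_val = min(best_val, min(loss(p) for p in pop))
print(round(best_val, 6))
```

In the real planning problem the decision vector would encode bus indices and DG ratings, with operational and safety constraints enforced via penalties or repair operators.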
Findings
The proposed DE-PSO takes advantage of the global optimization of DE and the convergence rate of PSO. Different case studies of multiple DG types are carried out with the suggested procedure for the 33- and 69-bus radial distribution systems and a real 16-bus distribution substation in Tamil Nadu to show the effectiveness of the proposed methodology and the distribution system performance. From the obtained results, there is a substantial decrease in power loss and an improvement of voltage levels across all buses of the system, while keeping the distribution system within its operational and safety constraints.
Originality/value
A comparison of an equivalent system with the DE and PSO algorithms used separately, and with other algorithms available in the literature, shows that the proposed method yields improved performance in terms of convergence rate and objective-function values. Finally, an economic benefit analysis is performed for the case in which a photovoltaic-based DG unit is allocated in the considered test systems.
Souheila Ben Guirat, Ibrahim Bounhas and Yahya Slimani
Abstract
Purpose
The semantic relations between Arabic word representations were recognized and widely studied in theoretical linguistics many centuries ago. Nonetheless, most of the previous research in automatic information retrieval (IR) has focused on stem- or root-based indexing, while lemmas and patterns are under-exploited. However, the authors believe that each of the four morphological levels encapsulates part of the meaning of words. The purpose of this paper is therefore to aggregate these levels using more sophisticated approaches, reaching the combination that best enhances IR.
Design/methodology/approach
The authors first compare the state-of-the-art Arabic natural language processing (NLP) tools in IR. This allows the most accurate tool at each representation level to be selected, i.e. four basic IR systems are developed. Then, the authors compare two rank aggregation approaches which combine the results of these systems. The first approach is based on linear combination, while the second exploits classification-based meta-search.
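The linear-combination aggregation can be sketched as CombSUM-style weighted fusion of normalised run scores. The runs, weights and min-max normalisation below are illustrative assumptions (the classification-based meta-search variant is not shown).

```python
def combine_runs(runs, weights):
    """CombSUM-style linear fusion: weighted sum of min-max-normalised
    scores from each run (a run maps doc_id -> retrieval score)."""
    fused = {}
    for run, w in zip(runs, weights):
        lo, hi = min(run.values()), max(run.values())
        span = (hi - lo) or 1.0              # guard against constant runs
        for doc, s in run.items():
            fused[doc] = fused.get(doc, 0.0) + w * (s - lo) / span
    return sorted(fused, key=fused.get, reverse=True)

# Hypothetical scores from three of the four morphological index levels.
stem  = {"d1": 9.1, "d2": 4.0, "d3": 2.2}
root  = {"d1": 3.0, "d2": 8.5, "d3": 7.9}
lemma = {"d1": 6.4, "d3": 6.0}
ranking = combine_runs([stem, root, lemma], [0.4, 0.3, 0.3])
print(ranking)   # ['d1', 'd2', 'd3']
```

Normalising each run before summing keeps one system's raw score scale (e.g. TFIDF versus probabilistic scores) from dominating the fused ranking.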
Findings
Combining different word representation levels, consistently and significantly enhances IR results. The proposed classification-based approach outperforms linear combination and all the basic systems.
Research limitations/implications
The work stands by a standard experimental comparative study which assesses several NLP tools and combining approaches on different test collections and IR models. Thus, it may be helpful for future research works to choose the most suitable tools and develop more sophisticated methods for handling the complexity of Arabic language.
Originality/value
The originality of the idea is to consider the richness of Arabic as an exploitable characteristic rather than a challenging limitation. Thus, the authors combine four different morphological levels for the first time in Arabic IR. This approach substantially outperformed previous research results.
Peer review
The peer review history for this article is available at: https://publons.com/publon/10.1108/OIR-11-2020-0515