Search results

1 – 10 of over 2000
Open Access
Article
Publication date: 30 September 2019

Laura Sinay, Maria Cristina Fogliatti de Sinay, Rodney William (Bill) Carter and Aurea Martins

Abstract

Purpose

The purpose of this paper is to critically analyze the influence of the algorithm used by scholarly search engines (Garfield’s algorithm) and to propose metrics that improve it so that science can develop in a more democratic way.

Design/methodology/approach

This paper used a snowball approach to collect data that allowed the history and logic behind Garfield’s algorithm to be identified. It then examines the foundations of the existing algorithms and databases of the major scholarly search engines. It concludes by proposing new metrics to overcome current restraints and democratize scientific discourse.

Findings

This paper finds that the studied algorithm currently biases the scientific discourse toward a narrow perspective, whereas it should take several researchers’ characteristics into consideration. It proposes substituting the h-index with the number of times the scholar’s most cited work has been cited. Finally, it proposes that works in languages other than English should be included.
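The proposed substitution can be made concrete with a small sketch (not from the paper; the citation data are invented for illustration): the h-index rewards a broad body of moderately cited work, while the proposed metric keeps only the citation count of a scholar's single most cited work.

```python
# Sketch comparing the h-index with the proposed replacement metric:
# the citation count of the scholar's most cited work.

def h_index(citations):
    """Largest h such that h of the works have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

def max_citation_metric(citations):
    """The paper's proposed metric: citations of the most cited work."""
    return max(citations, default=0)

scholar = [45, 30, 12, 7, 3]  # invented citation counts per publication
print(h_index(scholar))             # 4 works have at least 4 citations each
print(max_citation_metric(scholar)) # 45
```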

Research limitations/implications

The broad comprehension of any phenomenon should be based on multiple perspectives; therefore, the inclusion of diverse metrics will extend the scientific discourse.

Practical implications

Improving the existing algorithm will increase the chances of contact among different cultures, which stimulates rapid progress in the development of knowledge.

Originality/value

The value of this paper resides in demonstrating that the algorithm used in scholarly search engines biases the development of science. If updated as proposed here, science will be unbiased and bias-aware.

Details

RAUSP Management Journal, vol. 54 no. 4
Type: Research Article
ISSN: 2531-0488

Article
Publication date: 1 January 2005

R. Obiała, B.H.V. Topping, G.M. Seed and D.E.R. Clark

Abstract

Purpose

This paper describes how non‐orthogonal geometric models may be transformed into orthogonal polyhedral models. The main purpose of the transformation is to obtain a geometric model that is easy to describe and further modify without loss of topological information from the original model.

Design/methodology/approach

The transformation method presented in this paper is based on fuzzy logic (FL). The idea of using FL for this type of transformation was first described by Takahashi and Shimizu. This paper describes both the philosophy and the techniques behind the transformation method, as well as its application to some example 2D and 3D models. The problem addressed in this paper is to define a transformation technique that changes a non‐orthogonal model into a similar orthogonal model. The orthogonal model is unknown at the start of the transformation and is only specified once the transformation is complete; it has to satisfy certain conditions, i.e. it should be orthogonal.

Findings

The group of non‐orthogonal models that contain triangular faces, such as tetrahedra or pyramids, cannot be successfully recognized using this method. The algorithm fails to transform these types of problem because doing so requires modification of the structure of the model. It appears that only when the edges are divided into pieces and the sharp angles are smoothed can the method be successfully applied. Even though the method cannot be applied to all geometric models, many successful examples of 2D and 3D transformation are presented. Orthogonal models with the same topology, which makes them easier to describe, are obtained.

Originality/value

This transformation makes it possible to apply simple algorithms to orthogonal models enabling the solution of complex problems usually requiring non‐orthogonal models and more complex algorithms.

Details

Engineering Computations, vol. 22 no. 1
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 16 June 2021

Umesh K. Raut and L.K. Vishwamitra

Abstract

Purpose

Software-defined vehicular networks (SDVNs) assure direct programmability for controlling vehicles with improved accuracy and flexibility. This research focuses on a resource allocation strategy in which the seek-and-destroy algorithm is implemented in the controller so that resources are allocated effectively on the basis of a multi-objective function.

Design/methodology/approach

This study focuses on a resource allocation algorithm for SDVNs, together with a security analysis of the effect of attacks on the network. Genuine nodes are granted access to communication in the network on the basis of factors such as trust, throughput, delay and packet delivery ratio, using seek-and-destroy optimization. Moreover, optimal resource allocation is performed with the same optimization in such a way that the network lifetime is extended.

Findings

The security analysis is carried out by simulating attacks such as selective forwarding, replay, Sybil and wormhole attacks. It reveals that replay and Sybil attacks are the most dangerous, and that a security model ensuring protection against them is required in future so that the network lifetime is extended for prolonged communication. In the absence of attacks, the proposed method achieves 84.8513% remaining nodal energy, a 95.0535% packet delivery ratio (PDR), a transmission delay of 279.258 ms and a throughput of 28.9572 kbps.

Originality/value

The seek-and-destroy algorithm is a swarm intelligence-based optimization algorithm designed around the characteristics of scroungers and defenders, which is completely novel in the area of optimization. The diversification and intensification of the algorithm are well balanced, leading to good convergence rates.

Details

International Journal of Pervasive Computing and Communications, vol. 19 no. 1
Type: Research Article
ISSN: 1742-7371

Book part
Publication date: 30 September 2021

Ricarda Hammer and Tina M. Park

Abstract

While technologies are often packaged as solutions to long-standing social ills, scholars of digital economies have raised the alarm that, far from liberatory, technologies often further entrench social inequities and in fact automate structures of oppression. This literature has been revelatory but tends to replicate a methodological nationalism that erases global racial hierarchies. We argue that digital economies rely on colonial pathways and in turn serve to replicate a racialized and neocolonial world order. To make this case, we draw on W.E.B. Du Bois' writings on capitalism's historical development through colonization and the global color line. Drawing specifically on The World and Africa as a global historical framework of racism, we develop heuristics that make visible how colonial logics operated historically and continue to this day, thus embedding digital economies in this longer history of capitalism, colonialism, and racism. Applying a Du Boisian framework to the production and propagation of digital technologies shows how the development of such technology not only relies on preexisting racial colonial production pathways and the denial of racially and colonially rooted exploitation but also replicates these global structures further.

Details

Global Historical Sociology of Race and Racism
Type: Book
ISBN: 978-1-80117-219-6

Book part
Publication date: 23 September 2022

Thomas Gegenhuber, Danielle Logue, C.R. (Bob) Hinings and Michael Barrett

Abstract

Undoubtedly, digital transformation is permeating all domains of business and society. We envisage this volume as an opportunity to explore how manifestations of digital transformation require rethinking of our understanding and theorization of institutional processes. To achieve this goal, a collaborative forum of organization and management theory scholars and information systems researchers was developed to enrich and advance institutional theory approaches in understanding digital transformation. This volume’s contributions advance the three institutional perspectives. The first perspective, institutional logics, technological affordances and digital transformation, seeks to deepen our understanding of the pervasive and increasingly important relationship between technology and institutions. The second perspective, digital transformation, professional projects and new institutional agents, explores how existing professions respond to the introduction of digital technologies as well as the emergence of new professional projects and institutional agents in the wake of digital transformation. The third perspective, institutional infrastructure, field governance and digital transformation, inquires how new digital organizational forms, such as platforms, affect institutional fields, their infrastructure and thus their governance. For each of these perspectives, we outline an agenda for future research, complemented by a brief discussion of new research frontiers (i.e., digital work and sites of technological (re-)production; artificial intelligence (AI) and actorhood; digital transformation and grand challenges) and methodological reflections.

Details

Digital Transformation and Institutional Theory
Type: Book
ISBN: 978-1-80262-222-5

Book part
Publication date: 5 October 2018

Nima Gerami Seresht, Rodolfo Lourenzutti, Ahmad Salah and Aminah Robinson Fayek

Abstract

Due to the increasing size and complexity of construction projects, construction engineering and management involves the coordination of many complex and dynamic processes and relies on the analysis of uncertain, imprecise and incomplete information, including subjective and linguistically expressed information. Various modelling and computing techniques have been used by construction researchers and applied to practical construction problems in order to overcome these challenges, including fuzzy hybrid techniques. Fuzzy hybrid techniques combine the human-like reasoning capabilities of fuzzy logic with the capabilities of other techniques, such as optimization, machine learning, multi-criteria decision-making (MCDM) and simulation, to capitalise on their strengths and overcome their limitations. Based on a review of construction literature, this chapter identifies the most common types of fuzzy hybrid techniques applied to construction problems and reviews selected papers in each category of fuzzy hybrid technique to illustrate their capabilities for addressing construction challenges. Finally, this chapter discusses areas for future development of fuzzy hybrid techniques that will increase their capabilities for solving construction-related problems. The contributions of this chapter are threefold: (1) the limitations of some standard techniques for solving construction problems are discussed, as are the ways that fuzzy methods have been hybridized with these techniques in order to address their limitations; (2) a review of existing applications of fuzzy hybrid techniques in construction is provided in order to illustrate the capabilities of these techniques for solving a variety of construction problems and (3) potential improvements in each category of fuzzy hybrid technique in construction are provided, as areas for future research.

Details

Fuzzy Hybrid Computing in Construction Engineering and Management
Type: Book
ISBN: 978-1-78743-868-2

Book part
Publication date: 12 January 2021

Roger Friedland

Abstract

In this paper, I compare Theodore Schatzki’s practice theory, the existential phenomenology of Martin Heidegger upon whom Schatzki drew in its formation, and my own theory of institutional logics which I have sought to develop as a religious sociology of institution. I examine how Schatzki and I both differently locate our thinking at the level of practice. In this essay I also explore the possibility of appropriating Heidegger’s religious ontology of worldhood, which Schatzki rejects, in that project. My institutional logical position is an atheological religious one, poly-onto-teleological. Institutional logics are grounded in ultimate goods which are praiseworthy “objects” of striving and practice, signifieds to which elements of an institutional logic have a non-arbitrary relation, sources of and references for practical norms about how one should have, make, do or be that good, and a basis of knowing the world of practice as ordered around such goods. Institutional logics are constellations co-constituted by substances, not fields animated by values, interests or powers.

Because we are speaking against “values,” people are horrified at a philosophy that ostensibly dares to despise humanity’s best qualities. For what is more “logical” than that a thinking that denies values must necessarily pronounce everything valueless? Martin Heidegger, “Letter on Humanism” (2008a, p. 249).

Details

On Practice and Institution: Theorizing the Interface
Type: Book
ISBN: 978-1-80043-413-4

Details

Designing XR: A Rhetorical Design Perspective for the Ecology of Human+Computer Systems
Type: Book
ISBN: 978-1-80262-366-6

Article
Publication date: 1 August 1997

Cor van Dijkum

Abstract

Cybernetics started when Wiener stated that not only observations but also the way the observer feeds them back into reality are part of science. Dynamic system analysis supported the art and science of steering with feedback by computer modelling techniques. Cybernetics introduced the question of how self‐reference functioned in the feedback between observer and models. This led to the idea of cybernetics of the second order. Analyses the logic of feedback and how it relates to the question of how logical, mathematical and linguistic instruments can articulate scientific observations and connected theories. Uses the concept of complexity to relate cybernetics to the interdisciplinary practice of modern science. Presents the notion of “strangification” as a concept by which the transfer of knowledge from one discipline to another can be better understood and facilitated.

Details

Kybernetes, vol. 26 no. 6/7
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 29 April 2021

Mohamed Haddache, Allel Hadjali and Hamid Azzoune

Abstract

Purpose

The study of skyline queries has received considerable attention from database researchers since the end of the 2000s. Skyline queries are an appropriate tool that can help users make intelligent decisions over multidimensional data when different, and often contradictory, criteria are to be taken into account. Based on the concept of Pareto dominance, the skyline process extracts the most interesting objects (those not dominated in the sense of Pareto) from a set of data. Skyline computation methods often lead to a set whose large size is less informative for end users and not easy to exploit. The purpose of this paper is to tackle this problem, known as the large-size skyline problem, and to propose a solution that deals with it by applying an appropriate refining process.
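The Pareto dominance and skyline extraction the abstract refers to can be sketched as follows (a naive illustration, not the paper's SkyRef tool; the hotel data are invented, and lower values are assumed better on both dimensions):

```python
# Naive skyline computation based on Pareto dominance.
# Points are (price, distance) pairs; lower is better on each dimension.

def dominates(a, b):
    """a dominates b: a is no worse on every dimension and strictly better on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(points):
    """Return the points that no other point Pareto-dominates."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

hotels = [(50, 8), (60, 2), (45, 9), (70, 1), (55, 6), (65, 7), (48, 10)]
print(skyline(hotels))  # (65, 7) and (48, 10) are dominated and dropped
```

The quadratic pairwise check illustrates the concept only; the large result it can return on real data is exactly the "large size skyline problem" the paper's refining process targets.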

Design/methodology/approach

The problem of the skyline refinement is formalized in the fuzzy formal concept analysis setting. Then, an ideal fuzzy formal concept is computed in the sense of some particular defined criteria. By leveraging the elements of this ideal concept, one can reduce the size of the computed Skyline.

Findings

An appropriate and rational solution to the problem of interest is discussed. A tool named SkyRef is then developed, and extensive experiments are conducted with it on both synthetic and real datasets.

Research limitations/implications

The authors have conducted experiments on synthetic and some real datasets to show the effectiveness of the proposed approaches. However, thorough experiments on large-scale real datasets are highly desirable to show the behavior of the tool with respect to performance and execution-time criteria.

Practical implications

The developed tool, SkyRef, has applications in many domains that require decision-making or personalized recommendation and where the size of the skyline has to be reduced. In particular, SkyRef can be used in several real-world application areas such as economics, security, medicine and services.

Social implications

This work can be exploited in all domains that require decision-making, such as hotel finders, restaurant recommenders, recruitment of candidates, etc.

Originality/value

This study mixes two research fields: artificial intelligence (i.e. formal concept analysis) and databases (i.e. skyline queries). The key elements of the solution proposed for the skyline refinement problem are borrowed from fuzzy formal concept analysis, which makes it clearer and more rational, semantically speaking. On the other hand, this study opens the door to using formal concept analysis and its extensions to solve other issues related to skyline queries, such as relaxation.

Details

International Journal of Intelligent Computing and Cybernetics, vol. 14 no. 3
Type: Research Article
ISSN: 1756-378X
