Search results

1 – 10 of 87
Article
Publication date: 13 March 2017

Taiguo Qu and Zixing Cai

Isometric feature mapping (Isomap) is a very popular manifold learning method and is widely used in dimensionality reduction and data visualization. The most time-consuming step…

Abstract

Purpose

Isometric feature mapping (Isomap) is a very popular manifold learning method and is widely used in dimensionality reduction and data visualization. The most time-consuming step in Isomap is to compute the shortest paths between all pairs of data points based on a neighbourhood graph. The classical Isomap (C-Isomap) is very slow, due to the use of Floyd’s algorithm to compute the shortest paths. The purpose of this paper is to speed up Isomap.

Design/methodology/approach

Through theoretical analysis, it is found that the neighbourhood graph in Isomap is sparse. In this case, Dijkstra's algorithm with a Fibonacci heap (Fib-Dij) is faster than Floyd's algorithm. By using Fib-Dij in place of Floyd's algorithm, an improved Isomap method is proposed in this paper.
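
As a rough sketch of this replacement (not the authors' implementation), the all-pairs step can be run as one Dijkstra search per data point over the sparse neighbourhood graph. A binary heap via std::priority_queue stands in here for the Fibonacci heap; on a sparse graph it retains essentially the same advantage over Floyd's algorithm, up to a logarithmic factor.

```cpp
#include <functional>
#include <limits>
#include <queue>
#include <utility>
#include <vector>

// One Dijkstra search from `src` over the sparse neighbourhood graph used by
// Isomap. adj[u] holds (neighbour, edge length) pairs from the k-NN step.
// A binary heap (std::priority_queue) stands in for the Fibonacci heap; on a
// sparse graph either gives roughly O(N log N) work per source, so the
// all-pairs step costs about O(N^2 log N) instead of Floyd's O(N^3).
std::vector<double> dijkstra(
        const std::vector<std::vector<std::pair<int, double>>>& adj, int src) {
    const double INF = std::numeric_limits<double>::infinity();
    std::vector<double> dist(adj.size(), INF);
    using Item = std::pair<double, int>;  // (tentative distance, node)
    std::priority_queue<Item, std::vector<Item>, std::greater<Item>> heap;
    dist[src] = 0.0;
    heap.push({0.0, src});
    while (!heap.empty()) {
        auto [d, u] = heap.top();
        heap.pop();
        if (d > dist[u]) continue;  // stale heap entry
        for (auto [v, w] : adj[u]) {
            if (dist[u] + w < dist[v]) {
                dist[v] = dist[u] + w;
                heap.push({dist[v], v});
            }
        }
    }
    return dist;
}

// All-pairs geodesic distances for Isomap: one Dijkstra run per data point.
std::vector<std::vector<double>> allPairsGeodesics(
        const std::vector<std::vector<std::pair<int, double>>>& adj) {
    std::vector<std::vector<double>> D;
    D.reserve(adj.size());
    for (int s = 0; s < static_cast<int>(adj.size()); ++s)
        D.push_back(dijkstra(adj, s));
    return D;
}
```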

Findings

Using the S-curve, the Swiss-roll, the Frey face database, the mixed national institute of standards and technology database of handwritten digits and a face image database, the performance of the proposed method is compared with C-Isomap, showing consistency with C-Isomap and marked improvements in speed. Simulations also demonstrate that Fib-Dij reduces the computation time of the shortest paths from O(N³) to O(N² lg N).

Research limitations/implications

Due to computer limitations, the data sets used in this paper all contain fewer than 3,000 points. Therefore, researchers are encouraged to test the proposed algorithm on larger data sets.

Originality/value

The new method based on Fib-Dij can greatly improve the speed of Isomap.

Details

International Journal of Intelligent Computing and Cybernetics, vol. 10 no. 1
Type: Research Article
ISSN: 1756-378X

Article
Publication date: 7 February 2019

Tanvir Habib Sardar and Ahmed Rimaz Faizabadi

In recent years, there has been a gradual shift from sequential computing to parallel computing. Nowadays, nearly all computers have multicore processors. To exploit the available…

Abstract

Purpose

In recent years, there has been a gradual shift from sequential computing to parallel computing. Nowadays, nearly all computers have multicore processors, and parallel computing is necessary to exploit the available cores; it increases speed by processing huge amounts of data in real time. The purpose of this paper is to parallelize a set of well-known programs using different techniques to determine the best way to parallelize each program.

Design/methodology/approach

A set of numeric algorithms is parallelized both by hand, using OpenMP, and automatically, using the Pluto tool.
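
The abstract does not name the benchmark programs; as a hedged illustration of what hand parallelization with OpenMP looks like for a typical numeric kernel, a parallel dot product (not the paper's benchmark code) might be written as follows and compiled with -fopenmp.

```cpp
#include <cstdio>
#include <vector>
#include <omp.h>

// Hand parallelization of a numeric kernel with OpenMP: iterations of the
// loop are split across the available cores and the partial sums are merged
// by the reduction clause. An auto-parallelizer such as Pluto would instead
// try to derive an equivalent transformation from the sequential loop nest.
int main() {
    const int n = 1 << 20;
    std::vector<double> a(n, 1.0), b(n, 2.0);
    double dot = 0.0;

    #pragma omp parallel for reduction(+ : dot)
    for (int i = 0; i < n; ++i)
        dot += a[i] * b[i];

    std::printf("dot = %.1f using up to %d threads\n", dot, omp_get_max_threads());
    return 0;
}
```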

Findings

The work finds that a few of the algorithms are well suited to auto-parallelization with the Pluto tool, but many of the algorithms execute more efficiently with OpenMP hand parallelization.

Originality/value

The work provides an original study of parallelization using the OpenMP programming paradigm and the Pluto tool.

Details

Data Technologies and Applications, vol. 53 no. 1
Type: Research Article
ISSN: 2514-9288

Article
Publication date: 17 October 2008

Rui Xu and Donald C. Wunsch

The purpose of this paper is to provide a review of the issues related to cluster analysis, one of the most important and primitive activities of human beings, and of the advances…

Abstract

Purpose

The purpose of this paper is to provide a review of the issues related to cluster analysis, one of the most important and primitive activities of human beings, and of the advances made in recent years.

Design/methodology/approach

The paper investigates clustering algorithms rooted in machine learning, computer science, statistics, and computational intelligence.

Findings

The paper reviews the basic issues of cluster analysis and discusses the recent advances of clustering algorithms in scalability, robustness, visualization, irregular cluster shape detection, and so on.

Originality/value

The paper presents a comprehensive and systematic survey of cluster analysis and emphasizes its recent efforts in order to meet the challenges caused by the glut of complicated data from a wide variety of communities.

Details

International Journal of Intelligent Computing and Cybernetics, vol. 1 no. 4
Type: Research Article
ISSN: 1756-378X

Article
Publication date: 10 April 2009

Kerui Weng and Bo Qu

The purpose of this paper is to present a model to determine which roads should be built in each stage under a limited budget.

Abstract

Purpose

The purpose of this paper is to present a model to determine which roads should be built in each stage under a limited budget.

Design/methodology/approach

A multistage network discrete expansion model with a budget restriction is formulated, and a heuristic algorithm is developed that compares the original shortest paths with the sum of crossed shortest paths to avoid computing the shortest-path matrix repeatedly.
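
The abstract describes the comparison only at a high level; one plausible reading, sketched below with hypothetical names, is that the effect of a candidate road (s, t) of length c on an origin-destination pair (u, v) is estimated directly from the original shortest-path matrix D, so the matrix never has to be recomputed while candidates are screened.

```cpp
#include <algorithm>
#include <vector>

// Hypothetical helper illustrating the comparison described above: D is the
// shortest-path matrix of the original network, (s, t) is a candidate road of
// length c, and the updated distance for a pair (u, v) is obtained by routing
// across the new road in either direction, without recomputing the matrix.
double distanceWithNewRoad(const std::vector<std::vector<double>>& D,
                           int u, int v, int s, int t, double c) {
    const double viaNewRoad = std::min(D[u][s] + c + D[t][v],   // cross s -> t
                                       D[u][t] + c + D[s][v]);  // cross t -> s
    return std::min(D[u][v], viaNewRoad);
}
```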

Findings

The approach is very effective in finding which roads should be built for the largest net benefit.

Research limitations/implications

The main research implication is that the paper discusses a new multistage network discrete expansion problem.

Practical implications

The approach gives the optimal choice of a road-building schedule for a new urban district when budgets are limited.

Originality/value

The paper presents a model and an algorithm for the optimization of road building schedule based on budget restriction.

Details

Kybernetes, vol. 38 no. 3/4
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 6 March 2017

Dalibor Bartonek, Jiri Bures and Otakar Svabensky

This paper aims to deal with the formulation of the technological principle for precise positioning using global navigation satellite systems (GNSS) in railway engineering during…

Abstract

Purpose

This paper aims to deal with the formulation of the technological principle for precise positioning using global navigation satellite systems (GNSS) in railway engineering during construction and maintenance of a railway line and its spatial position. The optimal route is found by solving for the shortest Hamiltonian path in a graph with additional conditions at the nodes.
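
As a hedged illustration of the underlying combinatorial problem only (it ignores the additional node conditions and is not the authors' algorithm), the shortest Hamiltonian path over a small set of measurement points can be computed with the classical bitmask dynamic program:

```cpp
#include <algorithm>
#include <limits>
#include <vector>

// Shortest Hamiltonian path over n points with pairwise costs `cost`
// (Held-Karp bitmask DP, exponential in n, so only for small instances).
// The additional conditions at the nodes mentioned in the abstract are not
// modelled here; this is only an illustration of the base problem.
double shortestHamiltonianPath(const std::vector<std::vector<double>>& cost) {
    const int n = static_cast<int>(cost.size());
    const double INF = std::numeric_limits<double>::infinity();
    // dp[mask][v] = length of the best path visiting exactly `mask`, ending at v.
    std::vector<std::vector<double>> dp(1 << n, std::vector<double>(n, INF));
    for (int v = 0; v < n; ++v) dp[1 << v][v] = 0.0;          // any start node
    for (int mask = 1; mask < (1 << n); ++mask)
        for (int v = 0; v < n; ++v) {
            if (!(mask & (1 << v)) || dp[mask][v] == INF) continue;
            for (int w = 0; w < n; ++w) {
                if (mask & (1 << w)) continue;                 // already visited
                int next = mask | (1 << w);
                dp[next][w] = std::min(dp[next][w], dp[mask][v] + cost[v][w]);
            }
        }
    double best = INF;
    for (int v = 0; v < n; ++v)                                // any end node
        best = std::min(best, dp[(1 << n) - 1][v]);
    return best;
}
```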

Design/methodology/approach

The core of the algorithm is a dynamic data structure based on an event list. Optimizing the field measurement reduces its time demands and increases economic effectiveness.

Findings

The technology enables precise position determination with an absolute difference limit of 10 to 15 mm within the GNSS CZEPOS permanent network in the territory of the Czech Republic.

Research limitations/implications

The technology is the result of applied research.

Practical implications

This technology innovates the current procedure for geodetic control network determination used by the Railway Infrastructure Administration (a state organization) in the Czech Republic.

Originality/value

An event means a measurement at a given track point and time for a specified duration of observation. The algorithm was implemented in Borland Delphi. Optimizing the field measurement reduces its time demands and increases economic effectiveness. The technology enables precise position determination with an absolute difference limit of 10 to 15 mm within the GNSS CZEPOS permanent network in the territory of the Czech Republic, and it has been verified in the field on selected electrified and non-electrified railway lines.

Details

Engineering Computations, vol. 34 no. 1
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 6 September 2023

Antonio Llanes, Baldomero Imbernón Tudela, Manuel Curado and Jesús Soto

The authors review the main concepts of graphs, present the implemented algorithm and explain the different techniques applied to the graph to achieve an efficient…

Abstract

Purpose

The authors review the main concepts of graphs, present the implemented algorithm and explain the different techniques applied to the graph to achieve an efficient execution of the algorithm, both through the multiple cores available today and through massive data parallelism, parallelizing the algorithm and bringing its execution to GPUs through CUDA.

Design/methodology/approach

In this work, the authors approach the graph isomorphism problem from a point of view that has received very little attention: the application of parallelism and high-performance computing (HPC) techniques to the detection of isomorphism between graphs.
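
The abstract does not detail the algorithm itself; as a deliberately naive sketch of how shared-memory parallelism can be applied to isomorphism detection (not the authors' method, and without the CUDA/GPU path), candidate vertex assignments can be distributed across cores with OpenMP:

```cpp
#include <algorithm>
#include <vector>

// Naive parallel isomorphism test for two graphs given as n x n adjacency
// matrices A and B of equal size. Each OpenMP thread tries different images
// for vertex 0 and enumerates permutations of the remaining vertices,
// checking whether the mapping preserves adjacency. Exponential cost:
// purely an illustration of spreading the search space over cores.
bool isomorphicBruteForce(const std::vector<std::vector<int>>& A,
                          const std::vector<std::vector<int>>& B) {
    const int n = static_cast<int>(A.size());
    bool found = false;

    #pragma omp parallel for schedule(dynamic) reduction(|| : found)
    for (int first = 0; first < n; ++first) {
        std::vector<int> perm(n);            // perm[u] = image of vertex u in B
        perm[0] = first;
        std::vector<int> rest;               // remaining candidate images, sorted
        for (int v = 0; v < n; ++v)
            if (v != first) rest.push_back(v);
        bool hit = false;
        do {
            for (int u = 1; u < n; ++u) perm[u] = rest[u - 1];
            bool ok = true;
            for (int u = 0; u < n && ok; ++u)
                for (int v = 0; v < n && ok; ++v)
                    if (A[u][v] != B[perm[u]][perm[v]]) ok = false;
            hit = ok;
        } while (!hit && std::next_permutation(rest.begin(), rest.end()));
        found = found || hit;
    }
    return found;
}
```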

Findings

The results obtained give compelling reasons to pursue more in-depth studies of HPC techniques in these fields, since gains of up to a 722x speedup are achieved in the most favorable scenarios, with an average speedup of 454x.

Originality/value

The paper is new and original.

Details

Engineering Computations, vol. 40 no. 7/8
Type: Research Article
ISSN: 0264-4401

Book part
Publication date: 8 May 2003

Alan Nicholson, Jan-Dirk Schmöcker, Michael G H Bell and Yasunori Iida

The objective of this paper is to give an overview of various reliability concepts that have been developed in the last decades. The paper first summarises various indicators that…

Abstract

The objective of this paper is to give an overview of various reliability concepts that have been developed in the last decades. The paper first summarises various indicators that have been developed in order to measure the reliability of a network and then looks at techniques to calculate these indicators. The usefulness and limitations of the different indicators are discussed. The paper suggests that there is no single perfect indicator but that the choice of indicator and technique depends on several factors, including the viewpoint of the analyst and the type and range of interventions being considered. In order to assess the impact of incidents, the authors propose to distinguish between three types of intervention, namely “benevolent”, “neutral” or random, and “malevolent”. Also discussed is why the provision of up-to-date information to the traveller has a central role to play when trying to minimise the impact of an incident.

Details

The Network Reliability of Transport
Type: Book
ISBN: 978-0-08-044109-2

Article
Publication date: 1 August 2001

J.T.W. Damen

Current logistics systems are unable to react fast enough to rapidly changing environments, mainly because they are heavily focused on the goods handling processes. To compensate…

Abstract

Current logistics systems are unable to react fast enough to rapidly changing environments, mainly because they are heavily focused on the goods handling processes. To compensate for this lack of flexibility, logistics services should be carried out by independently controlled logistics resources. But these resources together do have to guarantee the overall quality of services. Service‐controlled agile logistics solves the conflict that arises – independent resources working together – by strictly distinguishing between control and handling. It is based on control of logistics processes by the requested services themselves, which create their own “agents”, made responsible for realizing the service in the best possible way under changing circumstances. In order to examine the feasibility of this approach, a simulation program has been developed, and some preliminary results are presented.

Details

Logistics Information Management, vol. 14 no. 3
Type: Research Article
ISSN: 0957-6053

Article
Publication date: 20 June 2023

Rehan Masood, Krishanu Roy, Vicente A. Gonzalez, James B.P. Lim and Abdur Rehman Nasir

Prefabricated construction has proven to be superior in terms of affordability and sustainability over the years. As a result of sustainable production, prefabricated…

Abstract

Purpose

Prefabricated construction has proven to be superior in terms of affordability and sustainability over the years. As a result of sustainable production, prefabricated housebuilding has evolved into a distinct industry reliant on supplier companies acting as supply chains (SCs) for housing projects. These companies' performance is critical to the successful implementation of prefabricated housebuilding technologies. However, in comparison to those choosing manufacturing as a strategy in other industries, the life span of these companies, providing innovative housing solutions, is relatively short. This is due to critical factors influencing performance, but the inter-relationships among the performance dimensions are more significant. This study establishes the inter-relationships among the performance dimensions for companies involved in house building with steel prefabricated housebuilding technologies.

Design/methodology/approach

The most recent factors were extracted from the literature. The relationships were developed using the interpretive structural modeling (ISM) method with input from industry experts, and the driving factors were determined using the Matrice d'Impacts Croisés Multiplication Appliqués à un Classement (MICMAC) technique.
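
As a hedged sketch of the mechanical core shared by ISM and MICMAC, assuming a binary direct-influence matrix elicited from the experts (the actual factors and expert judgements cannot be reproduced from the abstract), the reachability matrix and each factor's driving and dependence power can be computed as follows:

```cpp
#include <vector>

// Core matrix step shared by ISM and MICMAC: starting from a binary
// direct-influence matrix M (M[i][j] = 1 if factor i influences factor j),
// compute the reachability matrix by transitive closure (including
// self-reachability), then read off each factor's driving power (row sum)
// and dependence power (column sum), which MICMAC uses for classification.
struct IsmResult {
    std::vector<std::vector<int>> reachability;
    std::vector<int> drivingPower;     // row sums of the reachability matrix
    std::vector<int> dependencePower;  // column sums of the reachability matrix
};

IsmResult ismMicmac(std::vector<std::vector<int>> M) {
    const int n = static_cast<int>(M.size());
    for (int i = 0; i < n; ++i) M[i][i] = 1;                 // a factor reaches itself
    for (int k = 0; k < n; ++k)                              // Warshall transitive closure
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < n; ++j)
                if (M[i][k] && M[k][j]) M[i][j] = 1;
    IsmResult r{M, std::vector<int>(n, 0), std::vector<int>(n, 0)};
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j)
            if (M[i][j]) { ++r.drivingPower[i]; ++r.dependencePower[j]; }
    return r;
}
```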

Findings

Critical performance factors were classified according to performance dimensions, ranked and classified based on driving and dependence power. The inter-relationships among the performance dimensions of time, quality, cost, delivery, features and innovation were determined. Key performance strategies were proposed for prefabricated housebuilding companies involved in manufacturing and/or assembly of steel products.

Originality/value

This study established the interrelationships of performance dimensions for prefabricated house building (PHB) companies to develop strategies against critical challenges and remain competitive in the housing market. Previous research had not looked into the interrelationships among the performance dimensions. The proposed performance strategies are applicable to supplier organizations using steel prefabricated technologies in similar markets around the world.

Article
Publication date: 16 February 2024

Janina Seutter, Michelle Müller, Stefanie Müller and Dennis Kundisch

Whenever social injustice tackled by social movements receives heightened media attention, charitable crowdfunding platforms offer an opportunity to proactively advocate for…

Abstract

Purpose

Whenever social injustice tackled by social movements receives heightened media attention, charitable crowdfunding platforms offer an opportunity to proactively advocate for equality by donating money to affected people. This research examines how the Black Lives Matter movement and the associated social protest cycle after the death of George Floyd have influenced donation behavior for campaigns with a personal goal and those with a societal goal supporting the black community.

Design/methodology/approach

This paper follows a quantitative research approach by applying a quasi-experimental research design on a GoFundMe dataset. In total, 67,905 campaigns and 1,362,499 individual donations were analyzed.

Findings

We uncover a rise in donations for campaigns supporting the black community, which lasts substantially longer for campaigns with a societal than with a personal funding goal. Informed by construal level theory, we attribute this heterogeneity to changes in the level of abstractness of the problems that social movements aim to tackle.

Originality/value

This research advances the knowledge of individual donation behavior in charitable crowdfunding. Our results highlight the important role that charitable crowdfunding campaigns play in promoting social justice and anti-discrimination as part of social protest cycles.

Details

Internet Research, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1066-2243
