Search results

1–10 of over 17,000
Article
Publication date: 13 November 2017

Stuti Saxena

Abstract

Purpose

With the progressive trends in Open Data, this paper aims to underscore the significance of Open Linked Statistical Data (OLSD), identify the trajectory of its development, and highlight the prospects and challenges it presents.

Design/methodology/approach

This exploratory viewpoint presents a trajectory of OLSD, emphasizing the futuristic trend in its development.

Findings

Eight stages have been identified in the OLSD trajectory. Opening more and more data creates new possibilities for combining data and gaining new insights. In the future, data will automatically be opened and streamed and could be consumed by OLSD algorithms. Algorithms will flag the shortcomings and limitations of the data and help interpret it in such a way that the user is in the driver’s seat.

Research limitations/implications

While the paper follows an exploratory approach, it carries several implications for practitioners and academics. For instance, government may become more accountable with the adoption of advanced OLSD algorithms. Further research on OLSD is required to appreciate its impact in different settings, which would provide novel insights to the stakeholders concerned.

Originality/value

While Big and Open Linked Data (BOLD) has gained prominence in academic research, the focus on OLSD has remained scant. This paper seeks to underline the futuristic trends in OLSD.

Details

The Bottom Line, vol. 30 no. 3
Type: Research Article
ISSN: 0888-045X

Article
Publication date: 5 November 2020

Nan Zhang, Lichao Zhang, Senlin Wang, Shifeng Wen and Yusheng Shi

Abstract

Purpose

In large-size additive manufacturing (AM), a large printing area can be established either with tiled, fixed multiple printing heads or with a single dynamic printing head moving in the xy plane; both schemes require a layer decomposition after mesh slicing to generate segmented infill areas. The data-processing flow of these schemes is somewhat redundant and inefficient, especially for complex stereolithography (STL) models, so simplifying the redundant steps is important for improving the overall efficiency of large-size AM software. This paper aims to address these issues.

Design/methodology/approach

In this paper, a method of directly generating segmented layered infill areas is proposed for AM. First, a vertices–mesh hybrid representation of STL models is constructed based on a divide-and-conquer strategy. Then, a trimming–mapping procedure is performed on sliced contours acquired from partial surfaces. Finally, to link trimmed open contours and inside-signal square corners into segmented infill areas, a region-based open-contour closing algorithm is carried out using the developed data structures.
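
The approach starts from contours sliced out of an STL mesh. As a minimal illustration of that underlying slicing step (a generic plane–triangle intersection, not the authors' vertices–mesh hybrid structure), the following sketch computes the segment where one triangle crosses a given layer height:

```python
def slice_triangle(tri, z):
    """Intersect one triangle (three (x, y, z) vertices) with the plane
    at height z; return the crossing segment as two (x, y) points,
    or None if the plane misses the triangle.  Degenerate cases where
    a vertex lies exactly on the plane are ignored in this sketch."""
    points = []
    for i in range(3):
        (x0, y0, z0), (x1, y1, z1) = tri[i], tri[(i + 1) % 3]
        # An edge crosses the plane only when its endpoints straddle z.
        if (z0 - z) * (z1 - z) < 0:
            t = (z - z0) / (z1 - z0)          # interpolation parameter
            points.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return tuple(points) if len(points) == 2 else None


# A single triangle spanning z = 0..2, sliced at z = 1.
tri = ((0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (0.0, 0.0, 2.0))
print(slice_triangle(tri, 1.0))
```

A full slicer repeats this over every triangle per layer and then chains the segments into closed contours, which is where the paper's contour-closing algorithm takes over.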

Findings

With the proposed approach, segmented layered infill areas can be generated directly from STL models. Experimental results indicate that the approach is efficient, especially for complex STL models.

Practical implications

The proposed approach can generate segmented layered infill areas efficiently in some cases.

Originality/value

The region-based layered infill area generation approach discussed here supplements current data-processing technologies in large-size AM; it is well suited to parallel processing and improves the efficiency of large-size AM software.

Details

Rapid Prototyping Journal, vol. 27 no. 1
Type: Research Article
ISSN: 1355-2546

Article
Publication date: 22 July 2019

Qin Qin, Jigang Huang and Jin Yao

Abstract

Purpose

The purpose of this paper is to enhance the accuracy and efficiency of high-speed machining, avoid the speed fluctuation caused by acceleration/deceleration (ACC/DEC) and increase the smoothness of the feedrate when machining continuous corners or curves. The Hbot kinematic system was analyzed and combined with fused deposition modeling (FDM)-based additive manufacturing (AM) technology, and a real-time adaptive look-ahead speed control algorithm was proposed.
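
The Hbot arrangement couples two stationary motors to a single carriage, so every x or y move is a blend of both motor motions. As a rough sketch of that coupling (this assumes the common a = x + y, b = x − y convention; signs vary with belt routing, and none of this is taken from the paper):

```python
def hbot_inverse(x, y):
    """Carriage displacement (x, y) -> motor displacements (a, b).
    Uses the common a = x + y, b = x - y convention; actual signs
    depend on how the belt is routed."""
    return x + y, x - y


def hbot_forward(a, b):
    """Motor displacements (a, b) -> carriage displacement (x, y)."""
    return (a + b) / 2.0, (a - b) / 2.0


# A pure +x move turns both motors the same way; a pure +y move turns
# them in opposite directions.  Both motors stay on the frame, which
# keeps the moving mass, and hence the motion inertia, low.
print(hbot_inverse(10.0, 0.0))
print(hbot_inverse(0.0, 10.0))
```

The low moving mass of this arrangement is what the paper credits for the stable positioning accuracy reported in the findings.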

Design/methodology/approach

To validate the performance of the Hbot kinematic system and the proposed speed control algorithm, the positioning accuracy of the Hbot and cross structures was compared. Experimental verification was then conducted on three FDM-based 3D printers: a cross structure with an open-source speed control algorithm (FDM with cross-OS), a cross structure with the proposed speed control algorithm (FDM with cross-PS) and an Hbot structure with the proposed speed control algorithm (FDM with Hbot-PS).

Findings

The results indicate that the Hbot kinematic system yields highly stable positioning accuracy owing to its small motion inertia. Furthermore, the experimental verification shows that the efficiency, printing precision and surface finish of models for FDM with Hbot-PS are markedly higher than those for FDM with cross-PS and FDM with cross-OS, with FDM with cross-OS showing the worst performance. This work validates the contribution of the Hbot kinematic system and the proposed speed control algorithm to FDM-based AM technology.

Practical implications

The Hbot kinematic system and the proposed speed control algorithm have important implications for improving the accuracy of FDM machines, especially in the low-price segment. This work also shows future system developers a possible way of tackling the motion-inertia problem.

Originality/value

The study of the Hbot kinematic system and the proposed algorithm is expected to inform current research on improving the accuracy and efficiency of FDM-based AM technology.

Details

Rapid Prototyping Journal, vol. 25 no. 6
Type: Research Article
ISSN: 1355-2546

Article
Publication date: 11 December 2017

Jim Hahn and Courtney McDonald

Abstract

Purpose

This paper aims to introduce a machine learning-based “My Account” recommender for implementation in open discovery environments such as VuFind among others.

Design/methodology/approach

The approach to implementing machine learning-based personalized recommenders is undertaken as applied research leveraging data streams of transactional checkout data from discovery systems.

Findings

The authors discuss the need for large data sets from which to build an algorithm and introduce a prototype recommender service, describing the prototype’s data flow pipeline and machine learning processes.
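
The abstract does not disclose the prototype's actual model, but a minimal baseline for a checkout-driven recommender is item–item co-occurrence: items checked out by the same patron count as related. A hypothetical sketch (the data and names here are invented for illustration):

```python
from collections import defaultdict
from itertools import combinations


def cooccurrence(checkouts):
    """Build item-item co-occurrence counts from per-patron checkout
    histories: two items checked out by the same patron are related."""
    counts = defaultdict(lambda: defaultdict(int))
    for items in checkouts:
        for a, b in combinations(set(items), 2):
            counts[a][b] += 1
            counts[b][a] += 1
    return counts


def recommend(counts, item, k=3):
    """Top-k items most often checked out alongside `item`."""
    ranked = sorted(counts[item].items(), key=lambda pair: -pair[1])
    return [i for i, _ in ranked[:k]]


# Hypothetical checkout histories, one list per patron.
histories = [["dune", "foundation"],
             ["dune", "foundation", "hyperion"],
             ["dune", "hyperion"]]
print(recommend(cooccurrence(histories), "dune"))
```

A production pipeline of the kind the authors describe would stream transactional checkout events into such counts continuously rather than batch-building them, and would layer a learned model on top.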

Practical implications

The browse paradigm of discovery has neglected to leverage discovery system data to inform the development of personalized recommendations; with this paper, the authors show novel approaches to providing enhanced browse functionality by way of a user account.

Originality/value

In the age of big data and machine learning, advances in deep learning technology and data stream processing make it possible to leverage discovery system data to inform the development of personalized recommendations.

Details

Digital Library Perspectives, vol. 34 no. 1
Type: Research Article
ISSN: 2059-5816

Article
Publication date: 27 May 2020

Quentin Kevin Gautier, Thomas G. Garrison, Ferrill Rushton, Nicholas Bouck, Eric Lo, Peter Tueller, Curt Schurgers and Ryan Kastner

Abstract

Purpose

Digital documentation techniques of tunneling excavations at archaeological sites are becoming more common. These methods, such as photogrammetry and LiDAR (Light Detection and Ranging), are able to create precise three-dimensional models of excavations to complement traditional forms of documentation with millimeter to centimeter accuracy. However, these techniques require either expensive pieces of equipment or a long processing time that can be prohibitive during short field seasons in remote areas. This article aims to determine the effectiveness of various low-cost sensors and real-time algorithms to create digital scans of archaeological excavations.

Design/methodology/approach

The authors used a class of algorithms called SLAM (Simultaneous Localization and Mapping) along with depth-sensing cameras. While these algorithms have largely improved over recent years, the accuracy of the results still depends on the scanning conditions. The authors developed a prototype of a scanning device and collected 3D data at a Maya archaeological site and refined the instrument in a system of natural caves. This article presents an analysis of the resulting 3D models to determine the effectiveness of the various sensors and algorithms employed.

Findings

While not as accurate as commercial LiDAR systems, the prototype presented, employing a time-of-flight depth sensor and using a feature-based SLAM algorithm, is a rapid and effective way to document archaeological contexts at a fraction of the cost.

Practical implications

The proposed system is easy to deploy, provides real-time results and would be particularly useful in salvage operations as well as in high-risk areas where cultural heritage is threatened.

Originality/value

This article compares many different low-cost scanning solutions for underground excavations, along with presenting a prototype that can be easily replicated for documentation purposes.

Details

Journal of Cultural Heritage Management and Sustainable Development, vol. 10 no. 4
Type: Research Article
ISSN: 2044-1266

Article
Publication date: 24 August 2012

Hong‐Linh Truong and Schahram Dustdar

Abstract

Purpose

The purpose of this paper is to examine how cloud‐based information systems and services can support emerging and future requirements for sustainability governance of facilities.

Design/methodology/approach

The authors present basic elements of cloud‐based sustainability governance platforms, conduct a survey of existing industrial platforms and research works, discuss distinguishable and common characteristics of cloud computing platforms for sustainability governance, and give views on future research.

Findings

Cloud computing emerges as a potential candidate for supporting sustainability governance. However, several techniques must be provided to support multiple stakeholders, complex analysis and compliance processes.

Research limitations/implications

The number of industrial platforms and research works in the survey is limited, as is information about industrial platforms. Furthermore, industrial platforms are continuously updated, thus some information might be outdated.

Originality/value

There exists no survey for understanding how cloud computing could be used for sustainability governance. The paper not only helps to understand state‐of‐the‐art in using cloud computing for sustainability governance but also discusses main components, stakeholders and requirements for cloud‐based sustainability governance platforms.

Details

International Journal of Web Information Systems, vol. 8 no. 3
Type: Research Article
ISSN: 1744-0084

Article
Publication date: 14 December 2021

Vítor Tinoco, Manuel F. Silva, Filipe N. Santos, António Valente, Luís F. Rocha, Sandro A. Magalhães and Luis C. Santos

Abstract

Purpose

The motivation for robotics research in agriculture has been sparked by the increasing world population and the decreasing availability of agricultural labor. This paper aims to analyze the state of the art of pruning and harvesting manipulators used in agriculture.

Design/methodology/approach

A search was performed for papers matching specific keywords, and ten papers were selected for review based on a set of attributes.

Findings

The pruning manipulators were used in two different scenarios: grapevines and apple trees. These manipulators showed that a light-controlled environment could reduce visual errors and that prismatic joints on the manipulator are advantageous to obtain a higher reach. The harvesting manipulators were used for three types of fruits: strawberries, tomatoes and apples. These manipulators revealed that different kinematic configurations are required for different kinds of end-effectors, as some of these tools only require movement in the horizontal axis and others are required to reach the target with a broad range of orientations.

Originality/value

This work serves to reduce the gap in the literature regarding agricultural manipulators and will support new developments of novel solutions related to agricultural robotic grasping and manipulation.

Details

Industrial Robot: the international journal of robotics research and application, vol. 49 no. 4
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 18 October 2019

A. Kullaya Swamy and Sarojamma B.

Abstract

Purpose

Data mining plays a major role in forecasting the open price of the stock market. However, it fails to address dimensionality and the expectations of a naive investor. Hence, this paper aims to implement a future prediction model based on time series analysis.

Design/methodology/approach

In this model, the stock market data are fed to the proposed deep belief network (DBN), and the number of hidden neurons is optimized by a modified JAYA algorithm (JA) based on the fitness function. The algorithm is hence termed fitness-oriented JA (FJA), and the proposed model FJA-DBN. The primary objective of this open-price forecasting model is to minimize the error between the modeled and actual output.
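
The paper's FJA modifies JAYA around its fitness function; the standard JAYA update, which moves each candidate toward the population's best member and away from its worst, can be sketched on a toy hidden-neuron search like this (the fitness function below is invented purely for illustration):

```python
import random


def jaya(fitness, lo, hi, pop_size=8, iters=60, seed=1):
    """Minimise `fitness` over an integer range [lo, hi] with the
    standard JAYA rule: x' = x + r1*(best - |x|) - r2*(worst - |x|),
    accepting a candidate only if it improves.  This is plain JAYA on
    a toy search space, not the paper's FJA variant."""
    rng = random.Random(seed)
    pop = [rng.randint(lo, hi) for _ in range(pop_size)]
    for _ in range(iters):
        best = min(pop, key=fitness)
        worst = max(pop, key=fitness)
        new_pop = []
        for x in pop:
            r1, r2 = rng.random(), rng.random()
            cand = x + r1 * (best - abs(x)) - r2 * (worst - abs(x))
            cand = max(lo, min(hi, round(cand)))   # keep a valid count
            new_pop.append(cand if fitness(cand) < fitness(x) else x)
        pop = new_pop
    return min(pop, key=fitness)


# Toy fitness: pretend 40 hidden neurons minimises validation error.
print(jaya(lambda n: (n - 40) ** 2, lo=1, hi=200))
```

In the paper's setting the fitness would instead be the prediction error of a DBN trained with the candidate neuron count, which makes each evaluation far more expensive than this toy.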

Findings

The performance analysis demonstrates that FJA-DBN predicts the open prices of the Tata Motors, Reliance Power and Infosys data with smaller deviation, performing better in terms of mean error percentage, symmetric mean absolute percentage error, mean absolute scaled error, mean absolute error, root mean square error, L1-norm, L2-norm and infinity-norm (least infinity error).

Research limitations/implications

The proposed model can be used to forecast the open price details.

Practical implications

Investors constantly review past pricing history and use it to influence their future investment decisions. This analysis rests on two basic assumptions: first, that everything significant about a company is already priced into the stock; second, that the price moves in trends.

Originality/value

This paper presents a technique for time series modeling using JA. This is the first work that uses FJA-based optimization for stock market open price prediction.

Details

Kybernetes, vol. 49 no. 9
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 4 July 2022

Christophe Schinckus, Marta Gasparin and William Green

Abstract

Purpose

This paper aims to contribute to recent debates about financial knowledge by opening the black box of its algorithmization to understand how information systems can address the major challenges related to interactions between algorithmic trading and financial markets.

Design/methodology/approach

The paper analyses financial algorithms in three steps. First, the authors introduce the phenomenon of the flash crash; second, they conduct an epistemological analysis of algorithmization and identify three epistemological regimes – epistemic, operational and authority – which differ in how they deal with financial information; third, they demonstrate that a flash crash emerges when these three regimes become disconnected.

Findings

The authors open the black box of financial algorithms to understand why flash crashes occur and how information technology research can address the problem. A flash crash is a very rapid and deep fall in security prices, caused by an algorithmic misunderstanding of the market. The authors investigate the problem and propose an interdisciplinary approach to clarify the scope of the algorithmization of financial markets.

Originality/value

To manage the misalignment of information and potential disconnection between the three regimes, the authors suggest that information technology can embrace the complexity of the algorithmization of financial knowledge by diversifying its implementation through the development of a multi-sensorial platform. The authors propose sonification as a new mechanism for capturing and understanding financial information. This approach is then presented as a new research area that can contribute to the way financial innovations interact with information technology.

Details

Journal of Systems and Information Technology, vol. 24 no. 3
Type: Research Article
ISSN: 1328-7265

Article
Publication date: 12 March 2018

Laila Kechmane, Benayad Nsiri and Azeddine Baalal

Abstract

Purpose

The purpose of this paper is to solve the capacitated location routing problem (CLRP), which is an NP-hard problem that involves making strategic decisions as well as tactical and operational decisions, using a hybrid particle swarm optimization (PSO) algorithm.

Design/methodology/approach

PSO, a population-based metaheuristic, is combined with a variable neighborhood search (VNS) strategy to solve the CLRP.
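
The paper's neighborhood structures operate on CLRP particles; as a generic illustration of the VNS shell itself (shake the incumbent with the k-th neighborhood, accept improvements, restart at the first neighborhood on success), here is a toy sketch with an invented cost and invented moves:

```python
import random


def vns(cost, x0, neighborhoods, iters=200, seed=0):
    """Basic variable neighborhood search: perturb the incumbent with
    the k-th neighborhood move, keep the result only if it improves,
    restart at the first neighborhood on improvement, otherwise widen
    to the next one.  A toy shell, not the paper's hybrid PSO-VNS."""
    rng = random.Random(seed)
    best = x0
    for _ in range(iters):
        k = 0
        while k < len(neighborhoods):
            cand = neighborhoods[k](best, rng)
            if cost(cand) < cost(best):
                best, k = cand, 0     # improvement: back to first move
            else:
                k += 1                # no luck: try a wider neighborhood
    return best


def swap(route, rng):
    """Neighborhood 1: exchange two random positions."""
    i, j = rng.sample(range(len(route)), 2)
    r = list(route)
    r[i], r[j] = r[j], r[i]
    return r


def reverse_segment(route, rng):
    """Neighborhood 2: reverse a random segment."""
    i, j = sorted(rng.sample(range(len(route)), 2))
    return route[:i] + route[i:j + 1][::-1] + route[j + 1:]


# Toy cost: total displacement of the route from sorted order.
cost = lambda r: sum(abs(v - i) for i, v in enumerate(r))
print(vns(cost, [4, 2, 0, 3, 1], [swap, reverse_segment]))
```

In the paper's hybrid, moves like these are applied directly to PSO particles as the local-search phase, with a decoding step giving each particle its CLRP cost.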

Findings

The algorithm was tested on a set of instances available in the literature and gave good-quality solutions; results are compared with those obtained by other metaheuristic, evolutionary and PSO algorithms.

Originality/value

Local search is a time-consuming phase in hybrid PSO algorithms. In the VNS phase, a set of neighborhood structures suited to the solution representation used in the PSO algorithm is proposed: moves are applied directly to particles, a clear decoding method is adopted to evaluate a particle (solution), and there is no need to re-encode solutions as particles after applying local search.

Details

International Journal of Intelligent Computing and Cybernetics, vol. 11 no. 1
Type: Research Article
ISSN: 1756-378X
