Search results

1 – 10 of over 2000
Article
Publication date: 19 June 2009

Hayato Omori, Taro Nakamura and Takayuki Yada

Abstract

Purpose

An earthworm moves by peristaltic crawling, which keeps a large surface in contact with the surroundings during motion and requires less space than other locomotion mechanisms. Peristaltic crawling is suitable for moving through the space excavated by the anterior (front) of a robot; therefore, a peristaltic crawling robot is useful as an underground explorer. The purpose of this paper is to develop a peristaltic crawling robot with several parallel links and to compare its motion with that of an actual earthworm. Experiments were then conducted on a plane surface, in a tube and in vertically perforated dirt.

Design/methodology/approach

The proposed robot, which consists of several parallel mechanisms, has four units controlled in 3‐DOF. A unit expands in the radial direction when it contracts, increasing the friction between the unit and its surroundings. A dustproof covering is attached to prevent dirt from getting inside the units. The locomotion mechanism is the same as an actual earthworm's peristaltic crawling: the robot contracts its anterior unit, and the contraction then propagates towards the posterior (rear). Therefore, it requires no more space than that of the excavation part at the front of the robot.
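
The unit-by-unit contraction wave described above can be sketched as a simple schedule in which exactly one unit contracts (and therefore grips the wall) at a time, with the contraction advancing toward the rear each step. This is an illustrative sketch only; the unit count and timing are assumptions, not values from the paper.

```python
def peristaltic_wave(num_units=4, steps=8):
    """Yield per-step unit states for a peristaltic gait.

    A contracted unit (True) expands radially to grip the surroundings;
    the contraction then propagates one unit toward the posterior, so
    only one unit at a time releases its grip.
    """
    states = []
    for t in range(steps):
        contracted = t % num_units  # index of the currently contracting unit
        states.append([i == contracted for i in range(num_units)])
    return states

# The contraction starts at the anterior unit (index 0), propagates
# toward the posterior, then wraps around to the front again.
for state in peristaltic_wave():
    print("".join("C" if c else "-" for c in state))
```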

Findings

It was found that a three-unit robot composed of several parallel mechanisms had a wide range of manipulation, and that a four-unit robot moved by peristaltic crawling comparable to the motion of an actual earthworm. It was confirmed that the robot could turn on a plane surface and move upward and downward in a vertical pipe. Finally, the robot could move faster in vertically perforated dirt than in a pipe.

Originality/value

The robot is designed with several parallel links and equipped with a dustproof covering. The locomotion of an actual earthworm was videotaped and analysed for comparison with the analysed movements of the robot. It was confirmed that the robot could move by peristaltic crawling and turn on a plane surface. In addition, experiments were conducted in a narrow pipe and in vertically perforated dirt.

Details

Industrial Robot: An International Journal, vol. 36 no. 4
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 18 October 2019

Peng Wang, Chunxiao Song, Xiaoqiang Li and Peng Luo

Abstract

Purpose

The gait planning and control of a quadruped crawling robot affect the stability of the robot walking on a slope. The control comprises position control in the swing phase, force control in the support phase and switching control during force/position switching. To improve the passing ability of a quadruped crawling robot on a slope, this paper aims to propose a soft control strategy.

Design/methodology/approach

The strategy adopts the statically stable crawling gait as the main gait. As the robot moves forward, position/force section switching control is adopted: when the foot does not touch the ground, joint position control based on a variable-speed PID is performed; when the foot touches the ground, position-based impedance control is performed. A fuzzy multi-model switching control based on friction compensation is proposed to achieve smooth switching between force and position control.
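
The swing/support dispatch described above can be sketched as follows. The variable-speed PID here scales its proportional gain with the error magnitude, and the support-phase branch is a minimal position-based impedance law; the gains, gain schedule and stiffness value are illustrative assumptions, not the paper's tuned controller.

```python
class VariableSpeedPID:
    """PID whose proportional gain grows with |error| (a common
    'variable speed' scheme; the exact schedule here is illustrative)."""

    def __init__(self, kp=1.0, ki=0.1, kd=0.05, kp_scale=0.5):
        self.kp, self.ki, self.kd, self.kp_scale = kp, ki, kd, kp_scale
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt=0.01):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        # Larger errors raise the effective proportional gain,
        # speeding convergence in the swing phase.
        kp_eff = self.kp * (1.0 + self.kp_scale * abs(error))
        return kp_eff * error + self.ki * self.integral + self.kd * derivative


def joint_command(pid, pos_error, foot_on_ground, contact_force,
                  desired_force, stiffness=200.0):
    """Swing phase: position control via the variable-speed PID.
    Support phase: position-based impedance control, i.e. a position
    correction proportional to the force error."""
    if foot_on_ground:
        return (desired_force - contact_force) / stiffness
    return pid.update(pos_error)
```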

Findings

The proposed method offers a solution for stable passage in a slope environment. The quadruped crawling robot can realize smooth force/position switching, precise positioning in the swing phase and soft force control in the support phase. This is verified by simulation and test.

Originality/value

The method presented in this paper achieves minimal tracking errors and minimal jitter. Simulations and tests were performed to evaluate its performance.

Details

Industrial Robot: the international journal of robotics research and application, vol. 47 no. 1
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 6 April 2022

Peng Wang, Chunxiao Song, Renquan Dong, Peng Zhang, Shuang Yu and Hao Zhang

Abstract

Purpose

To address the problem that a quadruped crawling robot easily collides and overturns when facing obstacles and bulges while moving on complex slopes, this paper aims to propose an obstacle avoidance gait planning method for quadruped crawling robots based on slope terrain recognition.

Design/methodology/approach

First, considering that the low uniformity of feature points in terrain recognition images under complex slopes makes feature point extraction too slow, an improved ORB (Oriented FAST and Rotated BRIEF) feature point extraction method is proposed. Second, because the robustness of a single step cannot satisfy both obstacle avoidance and climbing over bumps, the crawling gait is planned according to the complex slope terrain, and an obstacle avoidance gait planning method based on the artificial potential field method is proposed. Finally, slope walking experiments are carried out in the Robot Operating System.
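
The obstacle avoidance step can be illustrated with the standard artificial potential field formulation: an attractive force toward the goal plus a repulsive force from each obstacle inside an influence distance. The gains and distances below are illustrative assumptions; the paper's gait-level planner is more elaborate.

```python
import math

def apf_step(pos, goal, obstacles, k_att=1.0, k_rep=100.0, d0=2.0, step=0.1):
    """One gradient step of the classical artificial potential field.

    pos, goal: (x, y) tuples; obstacles: list of (x, y) points.
    Obstacles farther than d0 exert no repulsion.
    """
    # Attractive force: proportional to the vector toward the goal.
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    # Repulsive force from each obstacle within the influence distance.
    for ox, oy in obstacles:
        d = math.hypot(pos[0] - ox, pos[1] - oy)
        if 1e-9 < d < d0:
            mag = k_rep * (1.0 / d - 1.0 / d0) / (d * d)
            fx += mag * (pos[0] - ox) / d
            fy += mag * (pos[1] - oy) / d
    # Move a fixed step length along the normalized net force.
    norm = math.hypot(fx, fy) or 1.0
    return (pos[0] + step * fx / norm, pos[1] + step * fy / norm)
```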

Findings

The proposed method provides a solution for efficient robot walking on slopes. The experimental results show that the improved ORB extraction algorithm reduces extraction time by 12.61% compared with the original ORB algorithm. The vibration amplitude of the robot’s centroid motion curve is significantly reduced, the contact force is reduced by 7.76% and the time for the foot contact force to stabilize is shortened by 0.25 s. These results are verified by simulation and test.

Originality/value

The method proposed in this paper uses the improved feature point recognition algorithm and obstacle avoidance gait planning to realize efficient walking of a quadruped crawling robot on a slope. The walking stability of the quadruped crawling robot is tested on a prototype.

Details

Industrial Robot: the international journal of robotics research and application, vol. 49 no. 5
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 14 June 2013

Chao Liu and Yan‐An Yao

Abstract

Purpose

The purpose of this paper is to propose a spatial six‐link RRCCRR (where R denotes a revolute joint, and C denotes a cylindric joint) mechanism to be used as the mechanism body of a biped robot with three translations (3T) manipulation ability.

Design/methodology/approach

This biped RRCCRR mechanism can reach any position on the ground in a crawling mode or, alternatively, a somersaulting mode. After the robot reaches a designated position, it can work in manipulation mode. Mobility, walking mode, kinematic and stability analyses are performed.

Findings

Based on this biped RRCCRR mechanism, a biped 3T lifter which can be used in industry is designed and analyzed. Finally, the proposed concept is verified by experiments on a prototype.

Originality/value

The work presented in this paper is a new exploration of applying traditional spatial linkage mechanisms to the field of biped robots, and also a new attempt to use the biped robot, generally found in the field of bionic robots, as a mobile manipulator platform in industry.

Details

Industrial Robot: An International Journal, vol. 40 no. 4
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 1 November 2023

Yifan Pan, Lei Zhang, Dong Mei, Gangqiang Tang, Yujun Ji, Kangning Tan and Yanjie Wang

Abstract

Purpose

This study aims to present a type of metamorphic mechanism-based quadruped crawling robot. The trunk design of the robot has a metamorphic mechanism, which endows it with excellent crawling capability and adaptability in challenging environments.

Design/methodology/approach

The robot consists of a metamorphic trunk and four serially connected three-joint legs. First, the walking and steering strategy is planned through stability and mechanics analysis. Then, the walking and steering performance, as well as the efficacy of the walking and turning strategy, is examined using virtual prototype technology.

Findings

The metamorphic quadruped crawling robot has wider applicability due to its variable trunk configuration and large leg motion space. The robot can move in two modes (constant trunk configuration and trunk configuration transformation, while walking and rotating, respectively) and exhibits outstanding stability and adaptability in prototype examination and verification.

Originality/value

The design enhances the capacity of the quadruped crawling robot to move across complex environments. Virtual prototype technology verifies that the proposed walking and steering strategy has good maneuverability and stability, which considerably expands the robot’s application opportunities in the fields of complicated scene identification and investigation.

Details

Industrial Robot: the international journal of robotics research and application, vol. 51 no. 1
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 16 November 2015

Saed Alqaraleh, Omar Ramadan and Muhammed Salamah

Abstract

Purpose

The purpose of this paper is to design a watcher-based crawler (WBC) that can crawl static and dynamic web sites and download only the updated and newly added web pages.

Design/methodology/approach

In the proposed WBC, a watcher file, which can be uploaded to web site servers, prepares a report that contains the addresses of the updated and newly added web pages. In addition, the WBC is split into five units, each responsible for a specific crawling process.
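
The watcher report could be consumed on the crawler side roughly as follows. The XML schema here (a `report` element with `page` children carrying a `status` attribute) is a hypothetical stand-in; the paper does not publish its actual format.

```python
import xml.etree.ElementTree as ET

# Hypothetical watcher report as uploaded by a site's watcher file.
WATCHER_XML = """<report site="http://example.com">
  <page status="updated">http://example.com/a.html</page>
  <page status="new">http://example.com/b.html</page>
  <page status="unchanged">http://example.com/c.html</page>
</report>"""

def pages_to_crawl(xml_text):
    """Return only the updated/new URLs from a watcher report, so the
    crawler skips unchanged pages instead of re-downloading the site."""
    root = ET.fromstring(xml_text)
    return [p.text for p in root.iter("page")
            if p.get("status") in ("updated", "new")]
```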

Findings

Several experiments have been conducted, and it has been observed that the proposed WBC increases the number of uniquely visited static and dynamic web sites compared with existing crawling techniques. In addition, the proposed watcher file not only allows crawlers to visit only the updated and newly added web pages, but also solves the crawler overlapping and communication problems.

Originality/value

The proposed WBC performs all crawling processes automatically, detecting all updated and newly added pages without explicit human intervention and without downloading entire web sites.

Details

Aslib Journal of Information Management, vol. 67 no. 6
Type: Research Article
ISSN: 2050-3806

Article
Publication date: 1 February 2016

Mhamed Zineddine

Abstract

Purpose

The purpose of this paper is to decrease the traffic created by search engines’ crawlers and solve the deep web problem using an innovative approach.

Design/methodology/approach

A new algorithm was formulated, based on the best existing algorithms, to optimize the traffic caused by web crawlers, which accounts for approximately 40 percent of all network traffic. The crux of this approach is that web servers monitor and log changes and communicate them as an XML file to search engines. The XML file includes the information necessary to generate refreshed pages from existing ones and to reference new pages that need to be crawled. Furthermore, the XML file is compressed to the minimum required size.
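
A minimal sketch of the server-side idea, assuming hypothetical element names: serialize the domain's change list to XML and compress it before handing it to search engines.

```python
import gzip
import xml.etree.ElementTree as ET

def build_change_file(changes):
    """Serialize a per-domain change list to XML and gzip it, in the
    spirit of the paper's server-side change log. `changes` is a list
    of (url, kind) pairs; element and attribute names are assumptions.
    """
    root = ET.Element("changes")
    for url, kind in changes:
        entry = ET.SubElement(root, "page", kind=kind)
        entry.text = url
    raw = ET.tostring(root, encoding="utf-8")
    return gzip.compress(raw)  # compressed blob served to crawlers

blob = build_change_file([("http://example.com/x", "refresh"),
                          ("http://example.com/y", "new")])
```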

Findings

The results of this study show that the traffic caused by search engines’ crawlers might be reduced on average by 84 percent for text content. However, binary content faces many challenges, and new algorithms have to be developed to overcome these issues. The proposed approach will certainly mitigate the deep web issue. The per-domain XML files used by search engines might also be used by web browsers to refresh their caches, helping reduce the traffic generated by normal users. This reduces users’ perceived latency and improves response time to HTTP requests.

Research limitations/implications

The study sheds light on the deficiencies and weaknesses of the algorithms monitoring changes and generating binary files. However, a substantial decrease of traffic is achieved for text-based web content.

Practical implications

The findings of this research can be adopted by web server software and browsers’ developers and search engine companies to reduce the internet traffic caused by crawlers and cut costs.

Originality/value

The exponential growth of web content and other internet-based services such as cloud computing, and social networks has been causing contention on available bandwidth of the internet network. This research provides a much needed approach to keeping traffic in check.

Details

Internet Research, vol. 26 no. 1
Type: Research Article
ISSN: 1066-2243

Article
Publication date: 3 June 2019

Tran Khanh Dang, Duc Minh Chau Pham and Duc Dan Ho

Abstract

Purpose

Data crawling in e-commerce for market research often comes with the risk of poor authenticity due to modification attacks. The purpose of this paper is to propose a novel data authentication model for such systems.

Design/methodology/approach

The data modification problem requires careful examination, in which data are re-collected and the two datasets are overlapped to verify reliability. The approach uses different anomaly detection techniques to determine which data are potentially fraudulent and should be re-collected. The paper also proposes a data selection model that uses weights of importance in addition to anomaly detection. The target is to significantly reduce the amount of data in need of verification while still guaranteeing high authenticity. Empirical experiments are conducted with real-world datasets to evaluate the efficiency of the proposed scheme.
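
A toy version of the selection step: flag records whose value deviates strongly from the rest (a simple z-score detector standing in for the paper's richer techniques), then rank the flagged records by importance weight so only a small, high-value subset is re-collected for verification. Thresholds and the detector itself are illustrative assumptions.

```python
def select_for_recrawl(prices, weights, z_thresh=2.0, top_k=2):
    """Return indices of records to re-collect: anomalous by z-score,
    ranked by importance weight, capped at top_k."""
    n = len(prices)
    mean = sum(prices) / n
    var = sum((p - mean) ** 2 for p in prices) / n
    std = var ** 0.5 or 1.0  # avoid division by zero on constant data
    # Flag records whose price deviates more than z_thresh std devs.
    flagged = [i for i, p in enumerate(prices)
               if abs(p - mean) / std > z_thresh]
    # Re-collect the most important flagged records first.
    flagged.sort(key=lambda i: weights[i], reverse=True)
    return flagged[:top_k]
```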

Findings

The authors examine several techniques for detecting anomalies in user and product data, which give an accuracy of approximately 80 per cent. The integration with the weight selection model is also shown to detect more than 80 per cent of the existing fraudulent records while avoiding falsely flagging legitimate ones, especially when the proportion of frauds is high.

Originality/value

With the rapid development of e-commerce, fraud detection on e-commerce data and in web crawling systems is a new and necessary research area. This paper contributes a novel approach to the data authentication problem in crawling systems, which has not been studied much.

Details

International Journal of Web Information Systems, vol. 15 no. 4
Type: Research Article
ISSN: 1744-0084

Article
Publication date: 2 October 2017

Mengni Zhang, Can Wang, Jiajun Bu, Liangcheng Li and Zhi Yu

Abstract

Purpose

As existing studies show that the accuracy of sampling methods depends heavily on the evaluation metric in web accessibility evaluation, the purpose of this paper is to propose a sampling method, OPS-WAQM, optimized for the Web Accessibility Quantitative Metric (WAQM). Furthermore, to support quick accessibility evaluation and real-time website accessibility monitoring, the authors also provide an online extension of the sampling method.

Design/methodology/approach

In the OPS-WAQM method, the authors propose a minimal sampling error model for WAQM and use a greedy algorithm to approximately solve the optimization problem of determining the sample numbers in different layers. To make OPS-WAQM online, the authors apply a sampling-in-crawling strategy.
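
The greedy allocation can be sketched as follows, with `size / n` as a crude stand-in for the paper's WAQM sampling error term: each step gives one more sample to the layer whose current per-sample error term is largest. The error model here is an assumption for illustration only.

```python
import heapq

def greedy_allocate(layer_sizes, budget):
    """Greedily assign a total sampling budget across layers.

    Starts with one sample per layer, then repeatedly adds a sample to
    the layer with the largest current error term (layer size divided
    by its sample count), a simple proxy for the paper's error model.
    """
    alloc = [1] * len(layer_sizes)  # at least one sample per layer
    # Max-heap via negated error terms.
    heap = [(-size / 1, i) for i, size in enumerate(layer_sizes)]
    heapq.heapify(heap)
    for _ in range(budget - sum(alloc)):
        _, i = heapq.heappop(heap)
        alloc[i] += 1
        heapq.heappush(heap, (-layer_sizes[i] / alloc[i], i))
    return alloc
```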

Findings

The sampling method OPS-WAQM and its online extension both achieve good sampling quality by choosing optimal sample numbers in different layers. Moreover, the online extension supports quick accessibility evaluation by sampling and evaluating pages during crawling.

Originality/value

To the best of the authors’ knowledge, the sampling method OPS-WAQM in this paper is the first attempt to optimize for a specific evaluation metric. Meanwhile, the online extension not only greatly reduces the serious I/O issues in existing web accessibility evaluation, but also supports quick web accessibility evaluation by sampling in crawling.

Details

Internet Research, vol. 27 no. 5
Type: Research Article
ISSN: 1066-2243

Book part
Publication date: 14 December 2004

Mike Thelwall

Abstract

Details

Link Analysis: An Information Science Approach
Type: Book
ISBN: 978-012088-553-4
