Search results

1 – 10 of over 3000
Article
Publication date: 3 April 2019

Yi Liu, Ming Cong, Hang Dong and Dong Liu

Abstract

Purpose

The purpose of this paper is to propose a new method based on three-dimensional (3D) vision technologies and human skill integrated deep learning to solve assembly positioning task such as peg-in-hole.

Design/methodology/approach

Hybrid camera configuration was used to provide the global and local views. Eye-in-hand mode guided the peg to be in contact with the hole plate using 3D vision in global view. When the peg was in contact with the workpiece surface, eye-to-hand mode provided the local view to accomplish peg-hole positioning based on trained CNN.
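
The global-to-local view switching described above can be sketched as a simple mode selector (a minimal illustration; the contact threshold, function name and return values are assumptions, and the paper's actual fine positioning relies on a trained CNN over depth images):

```python
# Sketch of the hybrid eye-in-hand / eye-to-hand switching logic.
# The threshold and interface are illustrative assumptions, not the
# authors' implementation.

CONTACT_FORCE_N = 2.0  # hypothetical contact-detection threshold (N)

def select_mode(contact_force):
    """Choose the active camera mode for the current assembly phase."""
    if contact_force < CONTACT_FORCE_N:
        # Free motion: the global 3D view guides the peg toward the hole plate.
        return "eye_in_hand"
    # Peg touches the workpiece: the local view (plus a CNN in the paper)
    # refines the peg-hole alignment.
    return "eye_to_hand"
```

For example, `select_mode(0.3)` keeps the global view during the approach, while `select_mode(4.1)` switches to the local view once contact is detected.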

Findings

The results of assembly positioning experiments proved that the proposed method successfully distinguished the target hole from the other holes of the same size using the CNN. The robot planned its motion according to the depth images and a human-skill guideline. The final positioning precision was sufficient for the robot to carry out force-controlled assembly.

Practical implications

The developed framework can have an important impact on the robotic assembly positioning process; combined with existing force-guided assembly technology, it can form a complete autonomous assembly solution.

Originality/value

This paper proposed a new approach to robotic assembly positioning based on 3D vision technologies and human skill integrated deep learning. A dual-camera swapping mode provided visual feedback for the entire assembly motion planning process. The proposed workpiece positioning method offered effective disturbance rejection, autonomous motion planning and increased overall performance with depth-image feedback. The proposed peg-hole positioning method with integrated human skill avoided target perceptual aliasing and enabled successive motion decisions for the robotic assembly manipulation.

Details

Industrial Robot: the international journal of robotics research and application, vol. 46 no. 1
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 7 September 2015

X. Wang, S.K. Ong and A.Y.C. Nee

Abstract

Purpose

This paper aims to propose and implement an integrated augmented-reality (AR)-aided assembly environment to incorporate the interaction between real and virtual components, so that users can obtain a more immersive experience of the assembly simulation in real time and achieve better assembly design.

Design/methodology/approach

A component contact handling strategy is proposed to model all the possible movements of virtual components when they interact with real components. A novel assembly information management approach is proposed to access and modify information instances dynamically in response to user manipulation. To support the interaction between real and virtual components, a hybrid marker-less tracking method is implemented.

Findings

A prototype system has been developed, and a case study of an automobile alternator assembly is presented. A set of tests is implemented to validate the feasibility, efficiency, accuracy and intuitiveness of the system.

Research limitations/implications

The prototype system allows the users to manipulate and assemble the designed virtual components to the real components, so that the users can check for possible design errors and modify the original design in the context of their final use and in the real-world scale.

Originality/value

This paper proposes an integrated AR simulation and planning platform based on hybrid tracking and ontology-based assembly information management. A component contact handling strategy based on collision detection and assembly feature surface mating reasoning is proposed to resolve component degrees of freedom.

Article
Publication date: 1 December 2002

Ping Ji, Albert C.K. Choi and Lizhong Tu

Abstract

The paper presents a virtual design and assembly system (VDAS) developed in a virtual reality (VR) environment. The VDAS is a VR‐based engineering application which allows engineers to design, modify and assemble mechanical products. As a prototype, this system has a knowledge‐based library of standard mechanical fastening parts, so a great deal of work is reduced during the product modeling stage. Besides, product models can be directly inserted or modified with a friendly user interface in the immersive VR environment. In order to reach the target of virtual design and integrate VR and computer aided design, the system adopts the variational design approach, so a product model has not only the geometric information, but also variation information and even assembly match information. Finally, the VDAS has the function of assembly planning, and several interactive manipulations, such as part modification, assembly plan verification and modification, have been realized.

Details

Assembly Automation, vol. 22 no. 4
Type: Research Article
ISSN: 0144-5154

Article
Publication date: 23 November 2022

Chetan Jalendra, B.K. Rout and Amol Marathe

Abstract

Purpose

Industrial robots are extensively used in the robotic assembly of rigid objects, whereas the assembly of flexible objects using the same robot becomes cumbersome and challenging due to transient disturbance. The transient disturbance causes vibration in the flexible object during robotic manipulation and assembly. This is an important problem as the quick suppression of undesired vibrations reduces the cycle time and increases the efficiency of the assembly process. Thus, this study aims to propose a contactless robot vision-based real-time active vibration suppression approach to handle such a scenario.

Design/methodology/approach

A robot-assisted camera calibration method is developed to determine the extrinsic camera parameters with respect to the robot position. Thereafter, an innovative robot vision method is proposed to identify a flexible beam grasped by the robot gripper using a virtual marker and to obtain its dimensions, tip deflection and velocity. The finite element method (FEM) is used to model the dynamic behaviour of the flexible beam. The measured dimensions, tip deflection and velocity are fed to the FEM model to predict the maximum deflection. The difference between the maximum deflection and the static deflection of the beam gives the maximum error. Subsequently, the maximum error is used in the proposed predictive maximum-error-based second-stage controller to send the control signal for vibration suppression. The control signal, in the form of a trajectory, is communicated to the industrial robot controller, which accommodates the various delays present in the system.
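
The maximum-error signal that drives the second-stage controller can be written down in a few lines (a sketch; the symbol names, units and the proportional mapping are assumptions, since the paper derives the predicted deflection from a full FEM model):

```python
def maximum_error(predicted_max_deflection, static_deflection):
    """Error signal for the predictive second-stage controller: the gap
    between the FEM-predicted peak tip deflection and the beam's static
    (rest) deflection."""
    return predicted_max_deflection - static_deflection

def correction_command(err, gain=0.8):
    """Map the maximum error to a counter-motion amplitude for the robot
    trajectory (the proportional gain is a hypothetical choice)."""
    return -gain * err

e = maximum_error(12.5, 2.5)   # mm, illustrative values
cmd = correction_command(e)    # the robot moves opposite to the error
```

The trajectory built from `cmd` would then be sent to the robot controller, which absorbs the system delays before execution.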

Findings

The effectiveness and robustness of the proposed controller have been validated through simulation and experimental implementation on an ABB IRB 1410 industrial robot with a standard low-frame-rate camera sensor. In the experiments, two metallic flexible beams of different dimensions but the same material properties were considered. The robot vision method measures the dimensions within an acceptable error limit of ±3%. The controller suppresses the vibration amplitude by approximately 97% in an average time of 4.2 s and reduces the settling time by approximately 93% relative to the uncontrolled case. The vibration suppression performance is also compared with a classical control method and with recent results from the literature.

Originality/value

The important contributions of the current work are the following: an innovative robot-assisted camera calibration method that determines the extrinsic camera parameters without requiring any reference object such as a checkerboard; a robot vision method that identifies the object grasped by the robot gripper using a virtual marker and measures its dimensions while accommodating the perspective view; a robot vision-based controller that works with an FEM model of the flexible beam to predict the tip position, allowing different beam dimensions and material types to be handled; an approach for handling the various implementation delays so that vibration is suppressed effectively; and a second-stage controller that uses a low-frame-rate, low-cost camera and does not interfere with the internal controller of the industrial robot.

Details

Industrial Robot: the international journal of robotics research and application, vol. 50 no. 3
Type: Research Article
ISSN: 0143-991X

Article
Publication date: 28 January 2014

Gianmauro Fontana, Serena Ruggeri, Irene Fassi and Giovanni Legnani

Abstract

Purpose

The purpose of this paper was the design, development and testing of a flexible and reconfigurable experimental setup for the automatic manipulation of microcomponents, enhanced by a carefully developed vision-based control.

Design/methodology/approach

To achieve a flexible and reconfigurable system, an experimental setup based on a four-degree-of-freedom robot and a two-camera vision system was designed. Vision-based strategies were adopted to support the motion system in performing precise manipulation operations. A portable and flexible program, incorporating the machine vision module and the task-control module, was developed. Non-conventional calibration strategies were also conceived for the complete calibration of the work-cell. The setup was tested in repetitive grasping and releasing of microcomponents, also comparing different grasping and releasing strategies.
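
One ingredient of such a work-cell calibration is a mapping from image pixels to work-plane coordinates. A minimal least-squares affine fit illustrates the idea (an illustrative sketch, not the authors' non-conventional calibration procedure; the data and scale factors are invented):

```python
import numpy as np

def fit_affine(pixels, world):
    """Fit world ~= [u, v, 1] @ A by least squares from matched
    pixel/world point pairs (at least three non-collinear points)."""
    px = np.hstack([np.asarray(pixels, float), np.ones((len(pixels), 1))])
    A, *_ = np.linalg.lstsq(px, np.asarray(world, float), rcond=None)
    return A  # shape (3, 2): maps homogeneous pixels to XY millimetres

def pixel_to_world(A, u, v):
    """Apply the fitted affine map to one pixel coordinate."""
    return np.array([u, v, 1.0]) @ A

# Illustrative calibration data: 10 px = 1 mm, origin offset (5, 5) mm
pixels = [(0, 0), (100, 0), (0, 100), (100, 100)]
world = [(5, 5), (15, 5), (5, 15), (15, 15)]
A = fit_affine(pixels, world)
```

With the illustrative data above, `pixel_to_world(A, 50, 50)` recovers the point (10, 10) mm.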

Findings

The system demonstrated its ability to automatically manipulate microcomponents with two different types of vacuum gripper. The tests evaluated the success and precision of part grasping and release, a crucial aspect of micromanipulation. The results confirm reliable grasping, while release is hampered by adhesive effects. Different strategies were therefore adopted to improve the release of stuck components without negatively affecting the accuracy or repeatability of the positioning.

Originality/value

This work provided a flexible and reconfigurable architecture devoted to the automatic manipulation of microcomponents, methodologies for the characterization of different vacuum microgrippers, and quantitative information about their performance, to date missing in the literature.

Details

Assembly Automation, vol. 34 no. 1
Type: Research Article
ISSN: 0144-5154

Article
Publication date: 1 October 2005

B.P. Amavasai, F. Caparrelli, A. Selvan, M. Boissenin, J.R. Travis and S. Meikle

Abstract

Purpose

To develop customised machine vision methods for closed‐loop micro‐robotic control systems. The micro‐robots have applications in areas that require micro‐manipulation and micro‐assembly in the micron and sub‐micron range.

Design/methodology/approach

Several novel techniques have been developed to perform calibration, object recognition and object tracking in real‐time under a customised high‐magnification camera system. These new methods combine statistical, neural and morphological approaches.

Findings

An in‐depth view of the machine vision sub‐system that was designed for the European MiCRoN project (project no. IST‐2001‐33567) is provided. The issue of cooperation arises when several robots with a variety of on‐board tools are placed in the working environment. By combining multiple vision methods, the information obtained can be used effectively to guide the robots in achieving the pre‐planned tasks.

Research limitations/implications

Some of these techniques were developed for micro‐vision but could be extended to macro‐vision. The techniques developed here are robust to noise and occlusion so they can be applied to a variety of macro‐vision areas suffering from similar limitations.

Practical implications

The work here will expand the use of micro‐robots as tools to manipulate and assemble objects and devices in the micron range. It is foreseen that, as the requirement for micro‐manufacturing increases, techniques like those developed in this paper will play an important role for industrial automation.

Originality/value

This paper extends the use of machine vision methods into the micron range.

Details

Kybernetes, vol. 34 no. 9/10
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 1 August 1995

Chanan Syan and Yousef Mostefai

Abstract

The automated manufacturing systems of the future can only be feasible if they can recover from faults and errors automatically, effectively and efficiently. Reports on work carried out on error recovery problems in manufacturing cell controllers. Cell control systems invariably manage and schedule work in an automated cell, as well as carrying out the general tasks of communications, sequencing and recording. Presents a model for error recovery capability that uses system information, data and prior knowledge of errors to recover from system errors. Elucidates the structure and operation of the cell controller developed. The work so far has shown promise in achieving automatic recovery capability in cell control systems. Finally, identifies further developments for future work.

Details

Integrated Manufacturing Systems, vol. 6 no. 4
Type: Research Article
ISSN: 0957-6061

Article
Publication date: 1 August 2008

Peng Gaoliang, He Xu, Yu Haiquan, Hou Xin and Khalil Alipour

Abstract

Purpose

The virtual design environment offers users an opportunity to interact with a virtual prototype rather than physical models and to build a fixture configuration in a realistic way. However, the virtual reality (VR) environment tends to be inaccurate because humans have difficulty performing precise positioning tasks. It is therefore necessary to implement precise object manipulation methods for assembly and disassembly activities, so that users can perform modular fixture configuration design efficiently in a virtual environment. The purpose of this paper is to develop a VR-based modular fixture assembly design system that supports the design and assembly of modular fixture configurations in a virtual environment.

Design/methodology/approach

A geometric constraint-based method is utilized to represent and treat the assembly relationships between modular fixture elements. The paper presents a hybrid method of rule-based reasoning and fuzzy comprehensive judgment to capture the user's operation intent and recognize geometric constraints. Through a degrees-of-freedom-based analysis, a mathematical matrix is presented for representing and reducing the allowable motion of fixture elements, and a constraint-based motion navigation approach is proposed to ensure that the manipulation of a fixture component does not violate the existing constraints.
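
The degrees-of-freedom bookkeeping described above can be illustrated with a simple binary motion vector per constraint (a sketch under assumed conventions; the encoding and constraint names are illustrative, not the paper's actual matrix formulation):

```python
# Each constraint leaves a set of allowable motions, encoded here as a
# 6-element vector [Tx, Ty, Tz, Rx, Ry, Rz]; 1 = motion still allowed.
# The encoding and the example constraints are illustrative assumptions.

FREE = [1, 1, 1, 1, 1, 1]  # an unconstrained component

def combine(dof_a, dof_b):
    """Adding a constraint intersects the sets of allowable motions."""
    return [a & b for a, b in zip(dof_a, dof_b)]

# A planar face mate removes Tz, Rx, Ry; a pin fit removes Tx, Ty.
plane_mate = [1, 1, 0, 0, 0, 1]
pin_fit    = [0, 0, 1, 1, 1, 1]

allowed = combine(combine(FREE, plane_mate), pin_fit)
# Only Rz survives: the component may still rotate about the pin axis.
```

A motion navigation step would then project the user's requested displacement onto the surviving degrees of freedom, so that dragging a fixture element can never violate an established mate.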

Findings

The paper finds that the proposed techniques enable the convenient manipulation and accurate positioning of fixture elements in a virtual environment.

Practical implications

Component manipulation plays a key role in interactive virtual assembly design. The proposed approach enables interactive assembly design of modular fixtures in a virtual environment.

Originality/value

This paper presents a geometric constraint‐based approach that realizes automatic assembly relationship recognition, constraint solving and motion navigation for interactive modular fixture assembly design in a virtual environment.

Details

Assembly Automation, vol. 28 no. 3
Type: Research Article
ISSN: 0144-5154

Article
Publication date: 24 April 2007

Anna Eisinberg, Arianna Menciassi, Paolo Dario, Joerg Seyfried, Ramon Estana and Heinz Woern

Abstract

Purpose

The aim of the research is to perform an accurate micromanipulation task, the assembly of a lens system, implementing safe procedures in a flexible microrobot‐based workstation for micromanipulation.

Design/methodology/approach

The approach consists of designing and building a micromanipulation station based on mobile microrobots with 5 degrees of freedom and a size of a few cm3, capable of moving and manipulating by means of tube-shaped, multilayered piezo-actuators. Controlled by visual and force/tactile sensor information, the micro-robot can perform manipulation with a motion resolution down to 10 nm in a telemanipulated or semi-automated mode, freeing human operators from the difficult task of handling minuscule objects directly. Equipped with purpose-developed grippers, the robot can take over highly precise grasping, transport, manipulation and positioning of mechanical or biological micro-objects. A computer system using PC-compatible hardware components ensures real-time robot operation.

Findings

The robots and grippers described in this paper are highly versatile tools. Even if each specific application may require specific modifications, the proposed solution combines a very large stroke (the size of the base the robot works on) with very high motion resolution. These properties also make the robots well suited to working in a scanning electron microscope, for wafer inspection in a laboratory, and similar settings.

Research limitations/implications

Future work will include modifications to the existing system in order to enhance the flexibility of the workstation: e.g. other robots and other tools with different characteristics will be designed and fabricated. Research efforts will be devoted in particular to further miniaturization of the actuators.

Practical implications

This workstation can be used as a platform for assembling novel prototypes and as a test bench for new assembly procedures or products. The lens assembly procedure described in this work, even if not suitable for mass production, was useful for assessing the performance of the two-lens assembly system itself compared with standard single-lens systems.

Originality/value

The system proves that the development of mobile micro-robots is a promising approach to realising very small, flexible tools useful for different applications. By means of its intuitive teleoperation mode, the system enables the user to work in the micro-world; thanks to the force feedback, the user is almost immersed in the micro-world and gains a feel for the handled object.

Details

Assembly Automation, vol. 27 no. 2
Type: Research Article
ISSN: 0144-5154

Article
Publication date: 22 February 2011

Kiho Kim, Byung‐Suk Park, Ho‐Dong Kim, Syed Hassan and Jungwon Yoon

Abstract

Purpose

Hot-cells are shielded structures protecting individuals from radioactive materials. The purpose of this paper is to propose a design approach for a hot-cell simulator using digital mock-up (DMU) technology, combining haptic-guided complex robotic manipulation for assembly tasks in a virtual environment.

Design/methodology/approach

The principal reason for developing the simulator was to explore the feasibility of the hot-cell structure design and a collision-free assembly process. A simulation design philosophy is proposed that includes a DMU facility offering the ability to analyse operations and perform complex robotic manipulations in the virtual hot-cell environment. Furthermore, enhanced haptic mapping for tele-manipulation is proposed for training and guidance purposes.

Findings

From the analysis and task scenarios performed in the virtual simulator, the optimal positions of the manipulators were identified, along with the need for bridge-transported dual-arm servo-manipulators. Operation tasks were performed remotely using virtual hot-cell technology by simulating the scenarios in the DMU, reducing the overall operation cost and the user training effort. The graphic simulator substantially reduced the cost of the process and maintenance procedures, as well as of the process equipment, by providing a pre-analysis of the whole scenario for real manipulation.

Originality/value

This research contributes to the virtual hot-cell design philosophy. Tele-operated complex robotic operations are performed in a virtual hot-cell using DMU technology. The simulator provides improved haptic guidance with force and torque feedback, enhancing the realism of the virtual environment.

Details

Assembly Automation, vol. 31 no. 1
Type: Research Article
ISSN: 0144-5154
