Search results
1 – 10 of 23
Abstract
Purpose
The purpose of this paper is to discuss the autonomous navigation and guidance scheme for future precise and safe planetary landing.
Design/methodology/approach
Autonomous navigation and guidance schemes based on an inertial measurement unit (IMU) and optical navigation sensors are proposed for the precise and safe landing of spacecraft on the Moon and other planetary bodies. First, a vision‐aided inertial navigation scheme is suggested to achieve precise relative navigation; second, two autonomous obstacle‐detection algorithms are proposed, based respectively on grey-scale images from an optical navigation camera and on a digital elevation map from a light detection and ranging (LiDAR) sensor; and third, a flowchart of the automatic obstacle‐avoidance maneuver is presented.
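The digital-elevation-map half of the obstacle detection can be illustrated with a minimal sketch, assuming a simple local plane-fit hazard criterion; the slope and roughness thresholds below are illustrative choices, not values from the paper:

```python
import numpy as np

def detect_obstacles(dem, cell_size=1.0, slope_limit_deg=15.0, rough_limit=0.2):
    """Mark DEM cells whose local slope or roughness exceeds safe-landing limits."""
    rows, cols = dem.shape
    hazard = np.zeros_like(dem, dtype=bool)
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            patch = dem[i-1:i+2, j-1:j+2]
            # Fit a plane z = a*x + b*y + c to the 3x3 patch (least squares).
            ys, xs = np.mgrid[-1:2, -1:2]
            A = np.column_stack([xs.ravel() * cell_size,
                                 ys.ravel() * cell_size,
                                 np.ones(9)])
            coef, *_ = np.linalg.lstsq(A, patch.ravel(), rcond=None)
            slope = np.degrees(np.arctan(np.hypot(coef[0], coef[1])))
            roughness = (patch.ravel() - A @ coef).std()  # residual to the plane
            hazard[i, j] = slope > slope_limit_deg or roughness > rough_limit
    return hazard
```

A flat map yields no hazards, while a step discontinuity (a crater rim, say) is flagged in the cells whose 3x3 neighborhood straddles it.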
Findings
This paper finds that the proposed scheme outperforms the traditional planetary landing navigation and guidance mode based on the IMU and the deep space network.
Research limitations/implications
The presented schemes need to be further validated by mathematical simulations and hardware‐in‐the‐loop simulations before they can be used in real flight missions.
Practical implications
The presented schemes are applicable to both future planetary pin‐point landing missions and sample return missions with little modification.
Originality/value
This paper presents a new autonomous navigation and guidance scheme for achieving precise and safe planetary landing.
Xianglong Kong, Wenqi Wu, Lilian Zhang, Xiaofeng He and Yujie Wang
Abstract
Purpose
This paper aims to present a method for improving the performance of the visual-inertial navigation system (VINS) by using a bio-inspired polarized light compass.
Design/methodology/approach
The measurement model of each sensor module is derived, and a robust stochastic cloning extended Kalman filter (RSC-EKF) is implemented for data fusion. This fusion framework not only handles multiple relative and absolute measurements but also deals with outliers and sensor outages in each measurement module.
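One standard way to "deal with outliers" in an EKF fusion framework is Mahalanobis (chi-square) gating of each innovation before the update; the sketch below shows that gate on a plain Kalman update, with the stochastic-cloning bookkeeping of the RSC-EKF omitted for brevity:

```python
import numpy as np

CHI2_GATE_1DOF = 3.84  # 95% quantile of chi-square with 1 degree of freedom

def gated_update(x, P, z, H, R, gate=CHI2_GATE_1DOF):
    """Standard Kalman update, skipped when the innovation fails the gate."""
    y = z - H @ x                            # innovation
    S = H @ P @ H.T + R                      # innovation covariance
    d2 = float(y.T @ np.linalg.inv(S) @ y)   # squared Mahalanobis distance
    if d2 > gate:                            # treat as outlier / sensor fault
        return x, P, False
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P, True
```

A consistent measurement passes the gate and corrects the state; a gross outlier (e.g. a compass reading under magnetic disturbance) is rejected and the state is left untouched.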
Findings
The paper tests the approach on data sets acquired by a land vehicle moving in different environments and compares its performance against other methods. The results demonstrate the effectiveness of the proposed method for reducing the error growth of the VINS in the long run.
Originality/value
The main contribution of this paper lies in the design and implementation of the RSC-EKF for incorporating the homemade polarized light compass into a visual-inertial navigation pipeline. Real-world tests in different environments demonstrate the effectiveness and feasibility of the proposed approach.
Boxin Zhao, Olaf Hellwich, Tianjiang Hu, Dianle Zhou, Yifeng Niu and Lincheng Shen
Abstract
Purpose
This study aims to investigate whether smartphone sensors can be used in an unmanned aerial vehicle (UAV) localization system. With the development of technology, smartphones have been tentatively used in micro-UAVs owing to their light weight, low cost and flexibility. In this study, a Samsung Galaxy S3 smartphone is selected as an on-board sensor platform for UAV localization in Global Positioning System (GPS)-denied environments, and two main issues are investigated: Are the phone sensors appropriate for UAV localization? If so, what are the boundary conditions for employing them?
Design/methodology/approach
Efficient accuracy-estimation methodologies for the phone sensors are proposed that do not require any expensive instruments, so one can estimate a phone's sensor accuracy at any time. A visual-inertial odometry scheme is then introduced to evaluate the path-estimation performance achievable with the phone sensors.
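One plausible instrument-free characterization, sketched below, is to estimate a gyroscope's bias and white-noise density from a recording made while the phone lies still; the paper's exact methodology may differ, and this is only a minimal illustration of the idea:

```python
import numpy as np

def static_gyro_stats(samples, rate_hz):
    """Estimate (bias, white-noise density) from a stationary gyro log.

    samples : (N,) angular-rate readings for one axis, rad/s
    rate_hz : sampling rate in Hz
    """
    bias = samples.mean()                    # constant offset at rest
    std = samples.std(ddof=1)                # per-sample noise, rad/s
    noise_density = std / np.sqrt(rate_hz)   # rad/s/sqrt(Hz)
    return bias, noise_density
```

Feeding the recovered bias and noise density into a visual-inertial odometry filter as its IMU noise parameters is the natural next step.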
Findings
Boundary conditions of using smartphone in a UAV navigation system are found. Both indoor and outdoor localization experiments are carried out and experimental results validate the effectiveness of the boundary conditions and the corresponding implemented scheme.
Originality/value
With a phone as the payload, UAVs can be realized at a smaller scale and lower cost, enabling wider use in the field of industrial robots.
Hang Guo, Xin Chen, Min Yu, Marcin Uradziński and Liang Cheng
Abstract
Purpose
In this study, an indoor sensor information fusion positioning system of the quadrotor unmanned aerial vehicle (UAV) was investigated to solve the problem of unstable indoor flight positioning.
Design/methodology/approach
The presented system was built on Light Detection and Ranging (LiDAR), inertial measurement unit (IMU) and LiDAR-Lite devices. Based on these, one can obtain the aircraft's current attitude and its position vector relative to the target, and control the attitudes and positions of the UAV to reach the specified target positions. While building a UAV positioning model relative to the target for indoor scenarios with limited Global Navigation Satellite System (GNSS) coverage, the system senses the environment through sensors peripheral to an NVIDIA Jetson TX2, obtains the current attitude and position vector of the UAV, packs the data into the required format and delivers them to the flight controller. The flight controller then computes the required posture and steers the UAV to the specified target position.
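The step of steering the UAV toward the specified target can be sketched as a saturated proportional position controller; the gain and velocity limit below are illustrative assumptions, not values from the paper, which does not detail its control law:

```python
import numpy as np

def position_controller(pos, target, kp=0.8, v_max=0.5):
    """Map the 3-D position error (m) to a saturated velocity command (m/s)."""
    error = np.asarray(target, float) - np.asarray(pos, float)
    cmd = kp * error
    speed = np.linalg.norm(cmd)
    if speed > v_max:                 # saturate to keep indoor flight gentle
        cmd *= v_max / speed
    return cmd
```

Far from the target the command rides the saturation limit; near the target it decays proportionally, which gives a smooth approach without overshoot-prone bang-bang behavior.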
Findings
The authors used two systems in the experiment: the proposed UAV system and the Vicon system, which served as the reference for comparison. The Vicon positioning error can be considered lower than 2 mm across low- to high-speed experiments. The comparison demonstrated that the system fully meets the real-time positioning requirement (error less than 50 mm) for indoor quadrotor UAV flight, verifying the accuracy and robustness of the proposed method against Vicon and preliminarily achieving stable indoor flight.
Originality/value
The low-cost LiDAR/IMU fusion system verifies its accuracy and robustness against the Vicon reference and preliminarily achieves the aim of stable indoor flight.
Yanwu Zhai, Haibo Feng and Yili Fu
Abstract
Purpose
This paper aims to present a pipeline that progressively handles the online external-parameter calibration and estimator initialization of a stereo-inertial measurement unit (IMU) system; it does not require any prior information and is suitable for system initialization in a variety of environments.
Design/methodology/approach
Before calibration and initialization, a modified stereo tracking method is adopted to obtain a motion pose, which provides the prerequisites for the next three steps. First, the authors align the obtained pose with the IMU measurements and linearly calculate rough external parameters and a gravity vector to provide initial values for the subsequent optimization. Second, the authors fix the pose obtained by vision and recover the external and inertial parameters of the system by optimizing the IMU pre-integration. Third, the result of the previous step is used in a visual-inertial joint optimization to further refine the external and inertial parameters.
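The IMU pre-integration quantity that the second and third steps optimize can be sketched as follows, assuming simple Euler integration and a small-angle rotation update; production pipelines additionally track bias Jacobians and covariances, which are omitted here:

```python
import numpy as np

def preintegrate(gyro, accel, dt):
    """Integrate body-frame IMU samples into delta rotation/velocity/position.

    gyro, accel : (N, 3) angular rate (rad/s) and specific force (m/s^2)
    dt          : sample period (s)
    Returns (dR, dv, dp) expressed in the first frame's body coordinates.
    """
    dR = np.eye(3)
    dv = np.zeros(3)
    dp = np.zeros(3)
    for w, a in zip(gyro, accel):
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt**2
        dv = dv + (dR @ a) * dt
        # Small-angle rotation update (first-order exponential map).
        wx = np.array([[0, -w[2], w[1]],
                       [w[2], 0, -w[0]],
                       [-w[1], w[0], 0]])
        dR = dR @ (np.eye(3) + wx * dt)
    return dR, dv, dp
```

Because these deltas depend only on the IMU samples between two frames, they can be computed once and reused across optimization iterations, which is the point of pre-integration.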
Findings
The results of public data set experiments and real-world experiments show that this method has better accuracy and robustness than the state of the art.
Originality/value
This method improves the accuracy of external-parameter calibration and initialization and prevents the system from falling into a local minimum. Unlike traditional methods that solve the inertial navigation parameters separately, all inertial parameters are solved at once, with the result of each step seeding the next optimization so that the parameters are refined from coarse to fine. This avoids local minima, reduces the number of iterations during optimization and improves the efficiency of the system.
Chang Chen and Hua Zhu
Abstract
Purpose
This study aims to present a visual-inertial simultaneous localization and mapping (SLAM) method for the accurate positioning and navigation of mobile robots when the global positioning system (GPS) signal fails owing to buildings, trees and other obstacles.
Design/methodology/approach
In this framework, a feature-extraction method distributes features evenly across the image in texture-less scenes. The constant-luminosity assumption is relaxed, and the features are tracked by optical flow to enhance the stability of the system. The camera data and inertial measurement unit data are tightly coupled to estimate the pose through nonlinear optimization.
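One common way to distribute features across the image, sketched below, is to bucket detected corners into a coarse grid and keep only the strongest response per cell; the grid size and per-cell capacity are illustrative assumptions, as the paper does not specify its exact scheme:

```python
def distribute_features(features, img_w, img_h, grid=8, per_cell=1):
    """features: list of (x, y, score) corners. Returns an evenly spread subset."""
    cells = {}
    for x, y, score in features:
        key = (int(x * grid / img_w), int(y * grid / img_h))
        cells.setdefault(key, []).append((score, x, y))
    kept = []
    for bucket in cells.values():
        bucket.sort(reverse=True)            # strongest responses first
        kept.extend((x, y) for _, x, y in bucket[:per_cell])
    return kept
```

In a texture-less scene a plain top-N detector would pile all features onto the one textured region; bucketing forces weaker corners in bland regions to survive, which stabilizes the subsequent optical-flow tracking.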
Findings
The method runs successfully on the mobile robot, steadily extracting and tracking features in low-texture environments. The end-to-end error is 1.375 m over a total path length of 762 m. The authors achieve better relative pose error, scale and CPU load than ORB-SLAM2 on the EuRoC data sets.
Originality/value
The main contribution of this study is the theoretical derivation and experimental application of a new visual-inertial SLAM method that has excellent accuracy and stability on weak texture scenes.
Tianmiao Wang, Chaolei Wang, Jianhong Liang and Yicheng Zhang
Abstract
Purpose
The purpose of this paper is to present a Rao–Blackwellized particle filter (RBPF) approach for the visual simultaneous localization and mapping (SLAM) of small unmanned aerial vehicles (UAVs).
Design/methodology/approach
Measurements from an inertial measurement unit, a barometric altimeter and a monocular camera are fused to estimate the state of the vehicle while building a feature map. In this SLAM framework, an extra factorization method is proposed to partition the vehicle model into internal and external state subspaces. The internal state is estimated by an extended Kalman filter (EKF), a particle filter is employed for the external state estimation, and parallel EKFs are used for map management.
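The particle-filter half of an RBPF can be sketched as below for a 1-D "external" position with a Gaussian range measurement; the per-particle EKF map management and the paper's specific vehicle-model partition are omitted, so this is only a minimal illustration of the filtering structure:

```python
import numpy as np

def pf_step(particles, weights, control, meas, rng, q_std=0.1, r_std=0.5):
    """One predict/update/resample cycle of a particle filter."""
    # Propagate each particle through the (noisy) motion model.
    particles = particles + control + rng.normal(0, q_std, len(particles))
    # Weight by the measurement likelihood (Gaussian range sensor).
    weights = weights * np.exp(-0.5 * ((meas - particles) / r_std) ** 2)
    weights /= weights.sum()
    # Low-variance (systematic) resampling.
    n = len(particles)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
    return particles[idx], np.full(n, 1.0 / n)
```

In the Rao-Blackwellized arrangement, each resampled particle would additionally carry its own EKF-maintained map conditioned on that particle's trajectory, which is what keeps the particle count tractable.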
Findings
Simulation results indicate that the proposed approach is more stable and accurate than other existing marginalized particle filter-based SLAM algorithms. Experiments are also carried out to verify the effectiveness of the SLAM method by comparison with a reference global positioning system/inertial navigation system.
Originality/value
The main contribution of this paper is the theoretical derivation and experimental application of the Rao–Blackwellized visual SLAM algorithm with vehicle model partition for small UAVs.
Rokas Jurevičius and Virginijus Marcinkevičius
Abstract
Purpose
The purpose of this paper is to present a new data set of aerial imagery from robotics simulator (AIR). AIR data set aims to provide a starting point for localization system development and to become a typical benchmark for accuracy comparison of map-based localization algorithms, visual odometry and SLAM for high-altitude flights.
Design/methodology/approach
The presented data set contains over 100,000 aerial images captured from the Gazebo robotics simulator using orthophoto maps as the ground plane. Flights with three different trajectories are performed over urban and forest maps at different altitudes, totaling over 33 kilometers of flight distance.
Findings
A review of previous research studies shows that the presented data set is the largest currently available public data set with downward-facing camera imagery.
Originality/value
This paper addresses the lack of publicly available data sets for high-altitude (100‒3,000 meters) UAV flights; current state-of-the-art research on map-based localization systems for UAVs depends on real-life test flights and custom simulated data sets for the accuracy evaluation of algorithms. The presented data set solves this problem and aims to help researchers improve and benchmark new algorithms for high-altitude flights.
Erliang Yao, Hexin Zhang, Haitao Song and Guoliang Zhang
Abstract
Purpose
To realize stable and precise localization in the dynamic environments, the authors propose a fast and robust visual odometry (VO) approach with a low-cost Inertial Measurement Unit (IMU) in this study.
Design/methodology/approach
The proposed VO combines the direct method with the indirect method to track features and optimize the camera pose. It initializes the positions of tracked pixels with the IMU information, and the tracked pixels are then refined by minimizing photometric errors. Owing to the small convergence radius of the indirect method, dynamic pixels are rejected. Subsequently, the camera pose is optimized by minimizing reprojection errors. Frames containing little dynamic information are selected as keyframes. Finally, a local bundle adjustment is performed to refine the poses of the keyframes and the positions of the 3-D points.
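The photometric refinement step can be illustrated in one dimension: starting from an IMU-predicted position, a Gauss-Newton iteration slides a small patch along an image row until its intensities match the reference. This is a simplified sketch, not the paper's 2-D implementation:

```python
import numpy as np

def refine_shift(ref, cur, x0, patch=3, iters=20):
    """Find the subpixel shift d so cur[x0+d+k] matches ref[x0+k] over a patch."""
    xs = np.arange(-patch, patch + 1)
    d = 0.0
    for _ in range(iters):
        pos = x0 + d + xs
        i0 = np.floor(pos).astype(int)
        frac = pos - i0
        # Linear interpolation of the current image at the shifted positions.
        cur_vals = (1 - frac) * cur[i0] + frac * cur[i0 + 1]
        residual = cur_vals - ref[x0 + xs]          # photometric error
        grad = cur[i0 + 1] - cur[i0]                # image gradient (finite diff.)
        # One Gauss-Newton step on the photometric cost.
        step = -(grad @ residual) / (grad @ grad + 1e-12)
        d += step
        if abs(step) < 1e-6:
            break
    return d
```

The small basin of convergence of such iterations is exactly why a good IMU-based initialization matters: without it, the refinement can lock onto the wrong intensity structure.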
Findings
The proposed VO approach is evaluated experimentally in dynamic environments with various motion types; the results suggest that it achieves more accurate and stable localization than the conventional approach. Moreover, the proposed VO approach works well in environments with motion blur.
Originality/value
The proposed approach fuses the indirect method and the direct method with the IMU information, which improves the localization in dynamic environments significantly.
Yanwu Zhai, Haibo Feng, Haitao Zhou, Songyuan Zhang and Yili Fu
Abstract
Purpose
This paper aims to propose a method for the localization and mapping of a two-wheeled inverted pendulum (TWIP) robot on the ground using a stereo-inertial measurement unit (IMU) system. The method reparametrizes the pose according to the motion characteristics of the TWIP and considers the impact of uneven ground on the vision and IMU measurements, making it more adaptable to the real environment.
Design/methodology/approach
When the TWIP moves, it is constrained by the ground and swings back and forth to maintain balance. The authors therefore parameterize the robot pose as an SE(2) pose plus pitch, in accordance with the motion characteristics of the TWIP. Disturbances in the other directions are not omitted, however; they are error-modeled and integrated into the visual constraints and the IMU pre-integration constraints as error terms. Finally, the authors analyze the influence of these error terms on the vision and IMU constraints during the optimization. Compared with traditional algorithms, the algorithm is simpler and adapts better to the real environment.
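The SE(2)-plus-pitch parameterization can be sketched by expanding it into a full SE(3) pose; in this minimal version the roll and height disturbance terms that the paper error-models are simply set to zero:

```python
import numpy as np

def se2_pitch_to_se3(x, y, yaw, pitch):
    """Expand (x, y, yaw) on the ground plane plus balancing pitch into SE(3)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # heading
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # body pitch
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry                # yaw then pitch, no roll term
    T[:3, 3] = [x, y, 0.0]             # planar position, z fixed to the ground
    return T
```

Optimizing only these four parameters instead of a full six-degree-of-freedom SE(3) pose is what shrinks the problem, while the omitted directions re-enter as modeled error terms rather than free variables.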
Findings
The results of indoor and outdoor experiments show that, for the TWIP robot, the method has better positioning accuracy and robustness than the state of the art.
Originality/value
The algorithm in this paper is proposed for the localization and mapping of a TWIP robot. Unlike traditional positioning methods on SE(3), it parameterizes the robot pose as an SE(2) pose plus pitch according to the motion of the TWIP, and the motion disturbances in the other directions are integrated into the visual constraints and IMU pre-integration constraints as error terms. This simplifies the optimization parameters, better adapts to the real environment and improves the positioning accuracy.