This page describes our work on vision-guided flight. Quadrotors are a wonderful technology for putting an eye in the sky, but they are hard to control. Things become much more difficult when we try to fly close to solid (and unforgiving) objects, what we call close-quarters flying. Here you can find descriptions of our various projects, demonstration videos, and relevant publications and results.

Image-based visual servoing for pole inspection task


Video demonstration of "Inspection of Pole-Like Structure using Vision controlled VTOL UAV and Shared Autonomy".

Project overview

Inspecting vertical structures, such as light and power distribution poles, is a time-consuming, dangerous and expensive task with high operator workload. To address these issues, we propose a VTOL platform that can operate at close quarters whilst maintaining a safe stand-off distance and rejecting environmental disturbances. We adopt an Image-Based Visual Servoing (IBVS) technique using only two line features to stabilise the vehicle with respect to a pole. Only visual and inertial data are used, making the approach suitable for indoor or GPS-impaired environments.
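As a rough illustration of the control idea, the sketch below maps the two tracked pole-edge lines to lateral and stand-off velocity commands. This is a minimal stand-in, not the paper's controller: the pixel-space error mapping, the gains and all names are illustrative assumptions (a full IBVS design would derive the control from the interaction matrix for line features).

```cpp
// Minimal sketch, not the paper's controller: map the two tracked pole-edge
// lines to lateral and stand-off velocity commands. Gains and the
// pixel-space error mapping are illustrative assumptions.
#include <cstdio>

struct LinePair { double u_left, u_right; };   // horizontal image positions (px)
struct VelocityCmd { double v_lateral, v_standoff; };

VelocityCmd ibvsFromPoleEdges(const LinePair& seen, const LinePair& goal,
                              double k_lat = 0.5, double k_dist = 0.5) {
    double c_seen = 0.5 * (seen.u_left + seen.u_right);   // pole centre (px)
    double c_goal = 0.5 * (goal.u_left + goal.u_right);
    double w_seen = seen.u_right - seen.u_left;           // apparent width (px)
    double w_goal = goal.u_right - goal.u_left;           // encodes stand-off
    VelocityCmd cmd;
    cmd.v_lateral  = k_lat  * (c_goal - c_seen);  // centre the pole in the image
    cmd.v_standoff = k_dist * (w_goal - w_seen);  // positive = move towards pole
    return cmd;
}

int main() {
    // Pole seen slightly left of, and wider than, the goal configuration.
    VelocityCmd cmd = ibvsFromPoleEdges({290, 340}, {310, 330});
    std::printf("v_lateral=%.2f v_standoff=%.2f\n", cmd.v_lateral, cmd.v_standoff);
}
```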

Our research hexarotor platform is fitted with a front-facing camera and a quad-core 1.7 GHz ARM Cortex-A9 computer, which performs all processing onboard. The front-facing camera is a low-cost, high-speed Playstation EyeToy connected via USB.

The IMU provides angular rates, orientation (roll, pitch, yaw) and 3-axis acceleration. Altitude measurements are obtained from a downward-facing ultrasonic sensor. For night-time flights a high-powered LED is mounted to the front to illuminate the scene for the onboard camera.

 

We adopt a shared autonomy concept to allow an unskilled operator to control the vehicle accurately and safely. Under shared autonomy the computer performs the majority of the control task: the operator cannot intervene in low-level control, but can still provide supervisory high-level commands. Close-quarters navigation means navigating relative to specific structures, such as the cables of a bridge or the pole of a streetlight. This system allows an unskilled operator to easily and safely control a quadrotor to examine locations that are difficult to reach, for example inspecting bridges or streetlights for defects.
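To make the control split concrete, here is a minimal sketch of how such a shared-autonomy mixer might look: the autopilot keeps exclusive ownership of low-level stabilisation, while the operator's yaw-rate command passes through with a safety clamp. All names, limits and the structure itself are assumptions for illustration, not the project's code.

```cpp
// Illustrative sketch of the shared-autonomy split: the autopilot owns
// low-level stabilisation; the operator contributes only a supervisory
// yaw-rate command, clamped for safety. All names and limits are assumed.
#include <algorithm>
#include <cstdio>

struct AutopilotCmd { double roll, pitch, thrust; };  // from IBVS + altitude hold
struct OperatorCmd  { double yaw_rate; };             // from the RC transmitter
struct VehicleCmd   { double roll, pitch, thrust, yaw_rate; };

VehicleCmd mixSharedAutonomy(const AutopilotCmd& ap, const OperatorCmd& op) {
    VehicleCmd out;
    out.roll   = ap.roll;    // the operator cannot touch low-level control ...
    out.pitch  = ap.pitch;
    out.thrust = ap.thrust;
    // ... but a high-level yaw-rate command passes through, bounded so a
    // careless input cannot destabilise the vehicle.
    out.yaw_rate = std::clamp(op.yaw_rate, -0.5, 0.5);  // rad/s, assumed limit
    return out;
}

int main() {
    VehicleCmd cmd = mixSharedAutonomy({0.01, -0.02, 0.55}, {1.2});
    std::printf("yaw_rate after clamp: %.2f rad/s\n", cmd.yaw_rate);
}
```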

The figure below illustrates the data flow from the sensors to the software components where processing occurs. Different colours denote different sampling rates and arrows denote data flow at a given frequency. Each box is an individual ROS node implemented in C++. Precision Time Protocol (PTP) is used for time synchronisation between the onboard computer and the ground-station data logger.
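Since each processing box is a C++ ROS node, a hypothetical skeleton of one such node is sketched below: it subscribes to camera images and republishes tracked line features, stamping them with the image capture time so that PTP-synchronised logs line up. Topic names and the message type are assumptions, not the project's actual interfaces.

```cpp
// Hypothetical skeleton of one node in the pipeline; topic names and the
// output message type are assumptions for illustration.
#include <ros/ros.h>
#include <sensor_msgs/Image.h>
#include <geometry_msgs/PointStamped.h>

ros::Publisher line_pub;

void imageCallback(const sensor_msgs::ImageConstPtr& img) {
    geometry_msgs::PointStamped feature;
    feature.header.stamp = img->header.stamp;  // preserve capture time (PTP-synced)
    // ... line tracking on img->data would go here ...
    line_pub.publish(feature);
}

int main(int argc, char** argv) {
    ros::init(argc, argv, "line_tracker");
    ros::NodeHandle nh;
    line_pub = nh.advertise<geometry_msgs::PointStamped>("pole_lines", 10);
    ros::Subscriber sub = nh.subscribe("camera/image_raw", 1, imageCallback);
    ros::spin();
    return 0;
}
```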

Results

Results from indoor and outdoor flight experiments demonstrate that the system is able to successfully inspect and circumnavigate a pole.

 

Sample onboard images from the three hovering test environments: indoor (left), day-outdoor (middle) and night-outdoor (right). The top row contains the raw images and the bottom row the corresponding gradient images. Green vertical lines mark the goal line positions and red lines the tracked pole edges.

The following figure shows the top view of the ground-truth trajectory (measured with a Leica TS30 laser tracker, 5 Hz, 2 mm accuracy) with respect to the target coordinate frame for a pole inspection flight. The operator commands only yaw rate using the RC transmitter during the experiment, and a constant height of 0.7 m is maintained.

Related publication

Sa, Inkyu, Hrabar, Stefan and Corke, Peter, "Vertical Pole-Like Structure Inspection Using a VTOL UAV and Shared Autonomy", in IEEE International Conference on Intelligent Robots and Systems, Chicago, USA, 2014.

Sa, Inkyu and Corke, Peter, "Improved line tracking using IMU and Vision for visual servoing", in Australasian Conference on Robotics and Automation, 2013.


100 Hz position-based visual servoing for pole inspection task

 

Video demonstration of "Outdoor Flight Testing of a Pole Inspection UAV Incorporating High-Speed Vision".

Project overview

This project presents a pole inspection system for outdoor environments comprising a high-speed camera on a vertical take-off and landing (VTOL) aerial platform. The pole inspection task requires the vehicle to fly close to a structure while maintaining a fixed stand-off distance from it. Typical GPS errors make GPS-based navigation unsuitable for this task, however. When flying outdoors the vehicle is also affected by aerodynamic disturbances such as wind gusts, so the onboard controller must be robust to these disturbances in order to maintain the stand-off distance. Two problems must therefore be addressed: fast and accurate state estimation without GPS, and the design of a robust controller. We address these by a) performing visual-inertial relative state estimation and b) using a robust line tracker and a nested controller design. Our state estimation fuses high-speed camera images (100 Hz) and 70 Hz IMU data in an Extended Kalman Filter (EKF). We demonstrate results from outdoor experiments for pole-relative hovering, and for pole circumnavigation where the operator provides only yaw commands. Lastly, we show results for image-based 3D reconstruction and texture mapping of a pole to demonstrate the system's usefulness for inspection tasks.

Traditional pole inspections can require lengthy setups and road blockages for vehicle access, which disrupts traffic (right). Using a small, lightweight VTOL MAV platform for inspection tasks is cost-effective, quick, and safe for workers.

 

The following diagram illustrates the inputs and outputs of the vertical KF and horizontal EKF. L1 and L2 denote the two observed lines in the image plane. An IMU provides angular rates, angles and acceleration measurements, and the sonar module provides a height measurement. The estimated filter outputs and the desired goals are denoted with a hat and a star superscript respectively. The position and velocity controller modules output pitch, roll and thrust. We fuse data arriving at different sample rates: the 100 Hz line tracker (vision), a 70 Hz IMU and a 20 Hz sonar. For the slower sensors, the latest available measurement is used between updates.
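The sketch below shows, for a single axis, one simple way such multi-rate fusion can be arranged: the filter predicts with IMU acceleration at the fast rate and corrects with a position measurement only when a new sample from a slower sensor arrives. It is a generic constant-velocity Kalman filter with assumed noise values, not the project's tuned EKF.

```cpp
// Single-axis sketch of multi-rate fusion: predict with IMU acceleration at
// the fast rate, correct with a position fix only when a (slower) sensor
// delivers one. Assumed noise values, not the project's tuned filters.
#include <cstdio>

struct Kf1D {
    double x = 0, v = 0;                       // position (m), velocity (m/s)
    double P[2][2] = {{1, 0}, {0, 1}};         // state covariance
    double q = 0.05, r = 0.01;                 // process/measurement noise (assumed)

    void predict(double accel, double dt) {    // run at the IMU rate
        x += v * dt + 0.5 * accel * dt * dt;
        v += accel * dt;
        double p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q;
        double p01 = P[0][1] + dt * P[1][1];
        double p10 = P[1][0] + dt * P[1][1];
        P[0][0] = p00; P[0][1] = p01; P[1][0] = p10; P[1][1] += q;
    }

    void correctPosition(double z) {           // run only on new measurements
        double s  = P[0][0] + r;               // innovation covariance
        double k0 = P[0][0] / s, k1 = P[1][0] / s;
        double innov = z - x;
        x += k0 * innov;
        v += k1 * innov;
        double p00 = P[0][0], p01 = P[0][1];
        P[0][0] -= k0 * p00;  P[0][1] -= k0 * p01;
        P[1][0] -= k1 * p00;  P[1][1] -= k1 * p01;
    }
};

int main() {
    Kf1D kf;
    for (int i = 0; i < 100; ++i) {            // fast prediction loop
        kf.predict(/*accel=*/0.0, /*dt=*/0.01);
        if (i % 20 == 0) kf.correctPosition(0.5);  // slower position sensor
    }
    std::printf("x = %.3f m, v = %.3f m/s\n", kf.x, kf.v);
}
```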

Results

This section presents results for outdoor flight. We performed 32 pole-relative hovering flights using the experimental setup and observed a success rate of 78% (25/32), where the system was able to track the pole and maintain a hover position relative to it. An altitude controller maintained a constant height, allowing us to evaluate the horizontal controller performance independently. The results for the period 10–60 s of a 70 s flight are summarised in the following table.

Standard deviations of state estimation errors for the 100 Hz horizontal EKF and 70 Hz vertical KF.


The performance of the 100 Hz horizontal EKF and 70 Hz vertical KF estimators for the flight is shown in the following figure. The estimated position and velocity are evaluated by down-sampling the filter estimates to 5 Hz and computing the standard deviation of the errors between these and the laser tracker ground truth.
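A sketch of this evaluation step is shown below, under the assumption that estimates and ground truth are stored as plain fixed-rate arrays; the names and data layout are illustrative only.

```cpp
// Sketch of the evaluation above: down-sample a fast estimate to the
// ground-truth rate and compute the standard deviation of the errors.
#include <cmath>
#include <cstdio>
#include <vector>

double errorStdDev(const std::vector<double>& est_fast,   // e.g. 100 Hz
                   const std::vector<double>& truth_slow, // e.g. 5 Hz
                   int ratio) {                           // fast/slow rate ratio
    std::vector<double> err;
    for (std::size_t i = 0; i < truth_slow.size() && i * ratio < est_fast.size(); ++i)
        err.push_back(est_fast[i * ratio] - truth_slow[i]);
    if (err.empty()) return 0.0;
    double mean = 0.0;
    for (double e : err) mean += e;
    mean /= err.size();
    double var = 0.0;
    for (double e : err) var += (e - mean) * (e - mean);
    return std::sqrt(var / err.size());
}

int main() {
    std::vector<double> est(200, 1.02), truth(10, 1.00);   // toy data
    std::printf("sigma = %.4f m\n", errorStdDev(est, truth, 100 / 5));
}
```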

Experimental results for position estimation while hovering. The first and second rows show the 100 Hz horizontal EKF estimates with the 5 Hz ground truth, while the third row shows the 70 Hz vertical KF estimate with the ground truth. The desired positions are -1 m, 0 m and 0.6 m for x, y and z respectively.

Experimental results for velocity estimation while hovering: 100 Hz vx and vy (first and second rows) and 70 Hz vz (third row) with the 5 Hz ground truth. The desired velocity is taken as the output of the position controller.

 

Pole reconstruction workflow. The input for reconstruction is an undistorted image sequence recorded at 240 Hz and sub-sampled to 10 Hz. Full pairwise matching is performed.


3D pole reconstruction results. An original image (top left) and various views of the texture-mapped surface (top row); SfM-based camera trajectory estimation (bottom row). The trajectory is recovered only up to scale, since we use a monocular camera.

Related publication

Sa, Inkyu and Corke, Peter, "Close-quarters Quadrotor flying using Position Based Visual Servoing and a 100Hz Monocular Camera", in IEEE International Conference on Unmanned Aircraft Systems, 2014.

Sa, Inkyu, Hrabar, Stefan and Corke, Peter, "Outdoor Flight Testing of a Pole Inspection UAV Incorporating High-Speed Vision", in International Conference on Field and Service Robotics, Brisbane, Australia, December 2013.

Sa, Inkyu and Corke, Peter, "Improved line tracking using IMU and Vision for visual servoing", in Australasian Conference on Robotics and Automation, University of New South Wales, Australia, 2013.

 

Autonomous MAV flight in GPS-impaired environments using a monocular camera


Video demonstration of "Monocular Vision based Autonomous Navigation for a Open-Source MAVs in GPS-denied Environments". 

Project overview

This project presents a monocular-vision-guided autonomous navigation system for Micro Aerial Vehicles (MAVs) in GPS-impaired environments such as indoors. The major problem with a monocular system is that the depth scale of the scene cannot be determined without prior knowledge or additional sensors. To address this problem we minimise a cost function that combines a drift-free altitude measurement with the up-to-scale position estimate from the visual sensor. We also evaluate the proposed system in terms of the accuracy of the scale estimator, controller performance, and accuracy of state estimation by comparison with motion-capture ground-truth data. All resources, including source code, tutorial documentation and system models, are available online.
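For intuition, the cost has a closed-form minimiser in the simplest one-dimensional case: choosing the scale s that minimises the sum of (h_i - s*z_i)^2, where h_i is the drift-free altitude and z_i the corresponding up-to-scale visual height, gives s = (sum of h_i*z_i) / (sum of z_i^2). The sketch below implements this simplified least-squares version; the actual system solves a similar cost iteratively with Levenberg-Marquardt, as described in the implementation section below.

```cpp
// Simplified least-squares sketch of the scale estimation: choose s to
// minimise sum_i (h_i - s * z_i)^2, with h_i the sonar altitude and z_i the
// up-to-scale vSLAM height. The actual system uses Levenberg-Marquardt on a
// similar cost; this closed form is for intuition only.
#include <cstdio>
#include <vector>

double estimateScale(const std::vector<double>& sonar_h,
                     const std::vector<double>& vslam_z) {
    double num = 0.0, den = 0.0;
    for (std::size_t i = 0; i < sonar_h.size() && i < vslam_z.size(); ++i) {
        num += sonar_h[i] * vslam_z[i];
        den += vslam_z[i] * vslam_z[i];
    }
    return den > 0.0 ? num / den : 1.0;   // stationary point of the quadratic cost
}

int main() {
    // Synthetic data with true scale 2.0: the estimator recovers it exactly.
    std::vector<double> z = {0.1, 0.2, 0.3, 0.4};
    std::vector<double> h = {0.2, 0.4, 0.6, 0.8};
    std::printf("estimated scale = %.3f\n", estimateScale(h, z));
}
```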

The AR.Drone platform. It has a front-facing and a downward-facing camera, and an ultrasonic sensor for altitude estimation. Red, green and blue denote the x, y and z axes used in this project. Key features of this stable product are its light weight and its embedded, accurate lateral velocity estimation based on aerodynamics and optical flow.

 

Scale estimation for a monocular camera with a downward-facing sonar sensor. The dashed line denotes the original z (vertical axis) from PTAM, while the thick solid line is the metric z from PTAM with the estimated scale. The thin solid line is the altitude measured by the sonar sensor. The plot shows that the scale is accurately recovered, since the metric measurement (thin solid line) and the scale-recovered estimate (thick solid line) are close.

 

All software is implemented on the ROS platform and runs in real time. The figure below shows the system implementation. The autonomy driver retrieves data from, and sends commands to, the MAV. vSLAM is a keyframe-based front-end SLAM solution that provides up-to-scale position and orientation. The scale estimator runs Levenberg-Marquardt optimisation on the observed data to estimate the scale at 18 Hz. The pose estimator is an implementation of the Kalman Filter algorithm. Four PID controllers close the loop, one each for x, y, z and yaw. All software is available online.
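As a sketch of one of those four loops, a minimal PID with a crude anti-windup clamp is shown below. The gains, limits and the toy plant in the example are illustrative assumptions, not the project's tuned controllers.

```cpp
// Minimal sketch of one of the four PID loops (x, y, z, yaw). Gains, the
// anti-windup limit and the toy plant are illustrative assumptions.
#include <algorithm>
#include <cstdio>

struct Pid {
    double kp, ki, kd;
    double integral = 0.0, prev_err = 0.0;

    double step(double err, double dt) {
        integral = std::clamp(integral + err * dt, -1.0, 1.0);  // anti-windup
        double deriv = (err - prev_err) / dt;
        prev_err = err;
        return kp * err + ki * integral + kd * deriv;
    }
};

int main() {
    Pid altitude{1.2, 0.1, 0.05};            // assumed gains
    double z = 0.0, z_goal = 0.8, dt = 0.02; // hover target from the experiments
    for (int i = 0; i < 200; ++i) {
        double u = altitude.step(z_goal - z, dt);
        z += u * dt;                         // toy plant: command acts as velocity
        if (i % 50 == 0) std::printf("t=%.1fs z=%.3f m\n", i * dt, z);
    }
}
```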

Results

 

The top and bottom rows show indoor and outdoor experiments respectively. The first column shows images of the environment where the MAV is flown. The second column shows 3D point reconstruction and position estimation from vSLAM. The third column shows the corresponding images from the MAV's forward camera with feature extraction. The last column shows the top-view x, y position estimates while hovering. These clearly show that manoeuvring outdoors is more challenging than indoors, due to wind gusts, dynamic features and other unpredictable factors.

 

This section presents two experimental results: hovering and waypoint navigation. For hovering, the vehicle was given a target position (x = 0, y = 0, z = 0.8, in metres). The following table shows the accuracy of state estimation and control performance while hovering (0–90 s).

RMSE x (m): 0.0208    RMSE y (m): 0.0748    RMSE z (m): 0.0308
RMSE roll (deg): 0.8875    RMSE pitch (deg): 0.6054    RMSE yaw (deg): 0.7529


The top two rows show translation state estimation, and the third row shows the raw ultrasonic height measurement with ground-truth data. Note that the wireless connection is lost from around 30 s to 35 s, which introduces inaccurate state estimation. Although this affects control performance and state estimation, the quadrotor recovers its state tracking after the connection is re-established.

 

The figure below shows waypoint-following results. The vehicle is given five target positions in sequence: (0,0), (0,1), (1,1), (0,1) and (0,0), where coordinates are (x, y) in metres. The top-left plot shows a side view of the trajectory, for evaluating the z (vertical) controller during waypoint following. The top-right plot is a top view, and clearly shows that the scale estimate is recovered accurately. The bottom perspective view presents a qualitative evaluation of the waypoint following.

Related publication

Sa, Inkyu, He, Hu, Van Huynh and Corke, Peter, "Monocular Vision based Autonomous Navigation for a Cost-Effective Open-Source MAVs in GPS-denied Environments", in IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Wollongong, Australia, 2013.
