
Persistent Robotic Navigation

Australian Research Council Discovery Project DP110103006





Robot navigation systems based on conventional maps are unable to cope with changes in their environment, constraining their real-world application. This project creates lifelong robot navigation systems that explicitly account for changes in the environment by using a new type of map based on both space and time. A map in space and time can accommodate changes in the shape of the environment and recover from mistakes. New methods for robot vision will account for changes in the appearance of places over hours, days, weeks and years. We will demonstrate the new navigation and vision capabilities on robotic platforms that continuously adapt to changes in the environment over the robot's lifetime, establishing a new benchmark in real-world robotics.

Persistent Vision-Only Navigation

The aim is to enable a robot to navigate autonomously, using purely visual sensors, for extended periods of time.
Active research includes:

  • Robustness - investigating multi-hypothesis graph-based approaches to dealing with change and uncertainty in both geometry and appearance, at both the global and local level (see the sketch after this list).

  • Compact representations to aid localization.

  • Memory-based approaches to persistent navigation.

  • Robust recovery behavior to handle periods of vision blackout during navigation.
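
To illustrate the multi-hypothesis idea referred to in the first item, the Python sketch below keeps several candidate explanations of the local map alive and re-weights them as observations arrive. It is a minimal, hypothetical example: the class names, the single range sensor and the Gaussian likelihood are assumptions made for illustration, not the project's actual implementation.

import math

class MapHypothesis:
    """One candidate map configuration with an associated belief weight."""
    def __init__(self, name, expected_range, weight):
        self.name = name                      # label for this hypothesis
        self.expected_range = expected_range  # range the sensor should report if this map is correct
        self.weight = weight                  # current belief in this hypothesis

def likelihood(observed, expected, sigma=0.5):
    # Gaussian likelihood of an observed range under this hypothesis (assumed noise model).
    return math.exp(-0.5 * ((observed - expected) / sigma) ** 2)

def update(hypotheses, observation):
    # Re-weight every hypothesis with the latest observation, then normalize.
    for h in hypotheses:
        h.weight *= likelihood(observation, h.expected_range)
    total = sum(h.weight for h in hypotheses) or 1e-12
    for h in hypotheses:
        h.weight /= total
    return max(hypotheses, key=lambda h: h.weight)

# Two competing explanations of the local geometry: a corridor that is open or blocked.
hyps = [MapHypothesis("corridor open", expected_range=4.0, weight=0.5),
        MapHypothesis("corridor blocked", expected_range=1.0, weight=0.5)]
for z in [1.1, 0.9, 1.2]:                     # simulated range readings favouring "blocked"
    best = update(hyps, z)
print("most likely:", best.name, round(best.weight, 3))

In a graph-based map the same weighted hypotheses would be attached to nodes or edges, so that change and uncertainty in geometry and appearance can be tracked at both the local and global level.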


Dayoub, Feras, Morris, Timothy, Upcroft, Ben, & Corke, Peter (2013)
Vision-only autonomous navigation using topometric maps.
In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2013), 3-7 November 2013, Tokyo Big Sight, Tokyo, Japan.

Persistent navigation in densely crowded environments

Current state-of-the-art methods for mapping, localization, navigation and planning by mobile robots produce promising results when each of these problems is tackled individually. However, few mobile robots can demonstrate long-term autonomy using all of these methods together while operating unsupervised in a real-life environment. One example of such an environment is a public event. These events are densely crowded most of the time, which introduces a new set of challenges for mobile robots.


Dayoub, Feras, Morris, Timothy, Upcroft, Ben, & Corke, Peter (2013)
One robot, eight hours, and twenty four thousand people.
In Australasian Conference on Robotics and Automation (ACRA 2013), 2-4 December 2013, Sydney, Australia.

Long-term operation in everyday environment (Bookshop as an example)

The main challenge in mapping non-stationary environments for mobile robots is that the configuration of the environment can change in unpredictable ways. As a result, the internal representation the robot holds of the surrounding environment can easily become invalid and out of date, with potentially catastrophic consequences for the performance and efficiency of the robot's planning and navigation. This work presents a method that enables a mobile robot working in a non-stationary environment to plan its path and localize within multiple map hypotheses simultaneously. The maps are generated using a long-term and short-term memory mechanism that ensures only persistent configurations of the environment are selected to create the maps. The proposed method is evaluated in an office and a bookshop environment. Compared to navigation systems that use only one map, our system produces superior path planning and navigation in a non-stationary environment where paths can be blocked periodically, a common scenario that poses significant challenges for typical planners.
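
The long-term/short-term memory mechanism mentioned above can be sketched roughly as follows: features observed repeatedly enter short-term memory and, once persistent enough, are promoted to long-term memory; only long-term features are used to build the map hypotheses, and long-term features that repeatedly fail to reappear are forgotten. The Python below is an illustrative sketch with assumed thresholds, names and bookkeeping, not the published implementation.

class FeatureMemory:
    """Sketch of a short-term / long-term memory filter over map features."""
    def __init__(self, promote_after=5, forget_after=3):
        self.stm = {}        # feature id -> observation count (short-term memory)
        self.ltm = set()     # persistent features used to build map hypotheses
        self.missed = {}     # feature id -> consecutive misses, used to forget LTM features
        self.promote_after = promote_after
        self.forget_after = forget_after

    def observe(self, seen_ids):
        # Reinforce features seen in this scan; promote persistent ones to long-term memory.
        seen = set(seen_ids)
        for fid in seen:
            self.stm[fid] = self.stm.get(fid, 0) + 1
            self.missed[fid] = 0
            if self.stm[fid] >= self.promote_after:
                self.ltm.add(fid)
        # Forget long-term features that keep failing to reappear (e.g. a moved bookshelf).
        for fid in list(self.ltm - seen):
            self.missed[fid] = self.missed.get(fid, 0) + 1
            if self.missed[fid] >= self.forget_after:
                self.ltm.discard(fid)
                self.stm.pop(fid, None)

    def map_features(self):
        # Only persistent (long-term) features are used when generating map hypotheses.
        return sorted(self.ltm)

mem = FeatureMemory()
for scan in [{"wall", "shelf"}, {"wall", "shelf"}, {"wall"},
             {"wall", "shelf"}, {"wall", "shelf"}, {"wall", "shelf"}, {"wall", "shelf"}]:
    mem.observe(scan)
print(mem.map_features())    # the briefly occluded shelf is still judged persistent

A planner built on top of such a filter can then keep one map hypothesis per plausible persistent configuration, for example a passage that is sometimes blocked, and localize in all of them simultaneously as observations favour one configuration over another.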

Morris, Timothy, Dayoub, Feras, Corke, Peter, Wyeth, Gordon, & Upcroft, Ben (2014)
Multiple map hypotheses for planning and navigating in non-stationary environments.
In IEEE International Conference on Robotics and Automation (ICRA 2014), 31 May - 7 June 2014, Hong Kong, China.
