
Brief Bio

I am a Professor in Autonomous Systems at the Queensland University of Technology's (QUT) Institute for Future Environments, and a Chief Investigator in the Australian Centre for Robotic Vision. I am known for my research into field (environmental) robotics and its application to large-scale environmental monitoring, management and change quantification, particularly in marine and aquatic systems.

I received a B.Eng in Aerospace Engineering from the Royal Melbourne Institute of Technology and a PhD from QUT. I started my professional career in 1995 as a project engineer at Roaduser Research International. Following my PhD I joined the CSIRO Autonomous Systems Laboratory in 2001. As a Principal Research Scientist at CSIRO I held various roles including project leader and the Robotics Systems and Marine Robotics team leader before moving to QUT in 2013.

I have wide research interests including vision-based navigation, vision-based learning and classification, adaptive sampling and path planning, cooperative robotics, and visual and acoustic stealth. A detailed bio is on my QUT profile.

Research Projects

I undertake a range of robotics research particularly focused around vision-based navigation and control, adaptive sampling, associative learning, and image-based habitat mapping and change quantification. This fundamental research is typically applied to help solve challenging environmental science problems. Below is a summary of current (2018) projects.

RangerBot Autonomous Underwater Vehicle

The RangerBot Autonomous Underwater Vehicle (AUV) is a novel vision-based robotic tool developed to give coral reef managers, researchers and community groups extra ‘hands and eyes’ in the water to help monitor and manage various threats on the Great Barrier Reef. This includes monitoring reef health indicators like coral bleaching and water quality, mapping and inspection, and the monitoring and control of pests like the Crown-of-Thorns Starfish (COTS). The RangerBot AUV significantly extends the capabilities of its predecessor, the COTSbot AUV, and exploits real-time on-board vision for navigation, obstacle detection and management tasks. The RangerBot has been developed for single-person deployment and operation from a vessel of any size or from the shoreline, with an intuitive tablet-based interface created using feedback from key stakeholders. This project is in collaboration with the Great Barrier Reef Foundation with support from the Google Impact Challenge.

The official public release of RangerBot will be mid-August 2018. Stay tuned!

For more information check out the GBRF RangerBot page here.

Vision-based multi-robot formation control & docking

This project is developing scalable vision-based formation control algorithms that allow multiple AUVs to conduct novel swath bathymetry and benthic imagery surveys in complex coral reef environments. Using customized RangerBot AUVs, the approach exploits the vehicles' on-board vision capabilities in a V-formation "follow-the-leader" strategy to ensure complete coverage without the need for inter-robot wireless communications. A key challenge is to ensure complete, minimally overlapping coverage over complex 3D benthic terrain at relatively low altitudes above the seafloor (< 2 m).
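The follower side of a vision-based "follow-the-leader" scheme can be sketched as a simple controller that regulates the range and bearing to the vehicle ahead, as estimated from on-board vision. This is only an illustrative sketch; the function, gains and offsets below are hypothetical and not the project's actual algorithms.

```python
import math

# Hypothetical follower-side controller for a vision-based V-formation:
# each follower holds a desired range and bearing offset to the vehicle
# ahead, estimated entirely from its on-board camera (no inter-robot comms).

def follower_command(range_m, bearing_rad, desired_range=2.0,
                     desired_bearing=math.radians(30), k_r=0.5, k_b=1.0):
    """Return (surge, yaw_rate) commands from the measured leader
    range/bearing (e.g. from a vision-based leader detector)."""
    surge = k_r * (range_m - desired_range)            # close/open the gap
    yaw_rate = k_b * (bearing_rad - desired_bearing)   # hold the V offset
    return surge, yaw_rate

surge, yaw = follower_command(range_m=3.0, bearing_rad=math.radians(40))
```

Each follower runs the same loop on the vehicle diagonally ahead of it, so the V-formation scales without any wireless coordination.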

In addition to the vision-based formation control algorithms, new approaches to scalable docking (deployment and retrieval) of multiple AUVs are being developed and experimentally evaluated using the RangerBot AUVs and a customized Inference ASV. This work builds on previous vision-based AUV and ASV docking research for at-surface (paper) and underwater (paper) retrieval, and on cooperative underwater robotic systems (paper). The overall goal is to allow a single ASV to deploy and retrieve up to 4 AUVs at sea to demonstrate large-scale autonomous benthic surveying capabilities.

Autonomous Underwater Lander (Mission-oriented perception)

The Autonomous Underwater Lander (AUL) project is focused on developing advanced real-time vision systems that allow an underwater robot to automatically select benthic landing locations during its descent. The approach uses multi-resolution, multi-sensor classification: once a region of interest is selected from multi-beam sonar maps (e.g. Halimeda bioherms), the AUL descends while continuously classifying the benthos, guiding the robot to a landing site with the desired characteristics for sampling. The goal is to maximize the likelihood of obtaining relevant benthic and pore-water samples in previously unmapped, scientifically relevant locations. This project is in collaboration with Mardi McNeil and Alistair Grinham who are developing the physical sampling system for the AUL (a customized RangerBot AUV).
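The descend-and-classify loop described above can be sketched as follows: at each control step the lander scores candidate patches beneath it with its benthic classifier, drifts toward the best-scoring patch, and commits to land once confidence is high enough at low altitude. The function, thresholds and patch representation are illustrative assumptions, not the project's actual implementation.

```python
# Hypothetical descent logic for the AUL: steer toward the patch the
# classifier scores highest, and commit to landing only when a confident
# score coincides with a low altitude.

def select_landing_target(patch_scores, altitude_m,
                          commit_score=0.8, commit_altitude=3.0):
    """patch_scores: {(x, y): classifier confidence in [0, 1]} for
    candidate patches below the vehicle. Returns (target_xy, commit)."""
    target, score = max(patch_scores.items(), key=lambda kv: kv[1])
    commit = score >= commit_score and altitude_m <= commit_altitude
    return target, commit

# At 2.5 m altitude with one confident patch, the lander commits.
target, commit = select_landing_target(
    {(0.0, 0.0): 0.4, (1.5, -0.5): 0.9}, altitude_m=2.5)
```

Because the classification re-runs continuously during descent, early coarse (sonar-scale) decisions are progressively refined by higher-resolution imagery as the seafloor approaches.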

Automated marine pest population monitoring & management: COTSbot

This project is developing advanced image processing techniques and underwater robotic platforms to detect, count and map the distribution of a range of marine pests. It expands previous research into automated marine pest classification for Crown-of-Thorns Starfish (Acanthaster planci) and Northern Pacific Sea Star (Asterias amurensis), with the goal of improving detection rates and providing tools for accurately measuring their spatial and temporal distribution. The results will assist marine scientists and authorities in understanding pest movement dynamics, their impact, and in managing threats. This project is in collaboration with Feras Dayoub with financial support from QUTbluebox.

For more information check out the COTSbot project page here.

Large-scale aquatic greenhouse gas quantification

This project is developing novel techniques for the large-scale temporal quantification of greenhouse gases (particularly methane) from inland waterways. It is uniquely combining persistent robotic platforms, image-processing, sensor networks, and automated sensors. The techniques and sampling paradigms developed in this project are providing limnologists and ecologists the ability to accurately quantify methane flux rates, improving model development and fundamental process understanding. This work is in collaboration with Alistair Grinham from The University of Queensland.

Robots for Environmental Education

This project focuses on engaging primary and high-school students with robotic technology to raise awareness of key environmental issues and to encourage broader on-the-ground action to help mitigate these issues. In 2017, in collaboration with Manuela Toboaba and Tim Williams from the QUT School of Design, we developed the “Plastic Waste Elimination Challenge”, an interactive technology-based activity for engaging and educating the public about the impact of plastic waste/litter on waterways. It was first showcased at the 2017 QUT Robotronica event, which attracted over 22,000 people. A refined version was showcased at the 2018 World Science Festival Brisbane in collaboration with The Great Barrier Reef Foundation.

Another project, completed in 2017, evaluated how non-tech-savvy community groups can upscale Crown-of-Thorns Starfish control programs. It was funded by the Dalio Foundation and the Lord Mayor's Charitable Fund.

Inference: Robotic adaptive sampling

This project has created, and is demonstrating, new scalable adaptive sampling capabilities to enable large-scale monitoring of the environment, including dynamic and extreme events (e.g. floods, cyclones, fires), using multiple persistent robotic sensors. To facilitate algorithm development, a novel persistent robotic system called Inference has been developed. The system consists of multiple networked robotic boats and provides an open architecture that allows researchers to evaluate new sampling algorithms on real-world processes over extended periods of time.
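A minimal example of the kind of adaptive sampling decision such a system supports is the greedy variance-reduction heuristic: send the boat to the candidate site where the current estimate is most uncertain. This sketch is purely illustrative; the project's actual sampling algorithms are more sophisticated, and the function and data layout below are assumptions.

```python
# Hypothetical greedy adaptive-sampling step: given the predictive
# uncertainty of the current environmental estimate at each candidate
# site, visit the most uncertain site next (a common baseline heuristic).

def next_sample_site(candidates):
    """candidates: {site_id: predictive variance of the estimate there}.
    Returns the site the ASV should visit next."""
    return max(candidates, key=candidates.get)

site = next_sample_site({"A": 0.2, "B": 1.1, "C": 0.7})  # -> "B"
```

After each measurement the estimate (and hence the variances) is updated, so repeated application of this rule concentrates sampling effort where the process is least understood.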

The Inference ASVs are currently being used in collaboration with Sara Couperthwaite for water sampling at Mount Morgan, and Alistair Grinham for large-scale greenhouse gas sampling on inland waterways.

For more information check out the project page here.

High-speed Autonomous Surface Vehicle (ASV) control in narrow waterways

The goal of this project is to develop novel adaptive controllers and vision-based perception and trajectory planners for high-speed ASVs in narrow and cluttered waterways. Target operating scenarios include previously unmapped creeks and high flow-rate flooded rivers with limited and/or unreliable GPS localization, performing tasks such as sample collection and assisting with swift-water rescue operations. A prototype jet-powered ASV has been developed with a maximum speed of 22 knots. It has on-board cameras and LiDAR for situational awareness and a computer for real-time processing and control.
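The receding-horizon idea can be sketched as a sampling-based loop: forward-simulate a simple motion model over a short horizon for each candidate command, score the resulting trajectory, apply only the first command of the best one, then replan. The unicycle model, cost and parameters below are illustrative assumptions, not the controller used on the actual ASV.

```python
import math

# A minimal receding-horizon (sampling-based) control sketch for a planar
# vehicle: roll a unicycle model forward for each candidate yaw-rate,
# score the end state against a waypoint, keep the best.

def best_yaw_rate(x, y, heading, speed, goal, dt=0.2, horizon=10,
                  candidates=(-0.5, -0.25, 0.0, 0.25, 0.5)):
    best, best_cost = None, float("inf")
    for w in candidates:
        px, py, ph = x, y, heading
        for _ in range(horizon):               # forward-simulate the model
            ph += w * dt
            px += speed * math.cos(ph) * dt
            py += speed * math.sin(ph) * dt
        cost = (px - goal[0])**2 + (py - goal[1])**2
        if cost < best_cost:
            best, best_cost = w, cost
    return best    # applied for one step, then the whole loop repeats
```

Replanning at every step is what makes the scheme adaptive: disturbances such as river flow show up in the next state estimate and are corrected on the following cycle.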

The latest conference paper describing an experimental evaluation of an adaptive receding horizon controller for the high-speed ASV can be found here.

Maritime RobotX Challenge

The Maritime RobotX Challenge is an international student competition with the goal of significantly advancing the autonomy capabilities of Autonomous Surface Vehicles on a range of complex real-world tasks. The RobotX Challenge started in 2014 and is held every two years. QUT was selected as one of three teams to represent Australia at the Maritime RobotX Challenge that took place in Singapore in 2014. The Challenge sponsors provide all competing teams with a WAM-V USV manufactured by Marine Advanced Research Inc.; however, it is supplied with no propulsion, power, sensors or computing hardware. TeamQUT has fitted its platform with an electric propulsion system and various localization and perception sensors, as well as high-performance computing and communication hardware. TeamQUT consists of a group of enthusiastic students studying a range of engineering majors, including mechatronics, electrical, and computer and software systems. The team has developed and continually refines a set of novel vision-based target detection systems, as well as LiDAR- and radar-based mapping and path planning systems, which allowed them to place 3rd overall in Singapore (2014) and 2nd overall in Hawaii (2016). The next RobotX Challenge will be held in December 2018, with TeamQUT fielding a new team with improved hardware and software systems.

For more information check out TeamQUT's 2018 official website.

Archive: TeamQUT 2016 Journal Paper

Archive: TeamQUT's 2014 official website.

Visual and Acoustic Stealth

Tracking dynamic targets without being detected requires not only visual but also acoustic stealth. Our goal is to significantly extend both these concepts by uniquely combining visual and acoustic stealth to maintain continuous line-of-sight observation of a moving natural object of interest, such as a wild animal, in outdoor environments without being detected. We have demonstrated the combined acoustic and visual stealth approach for covertly tracking a moving target, and more recently extended it so the robot can recognise and use shadows as more discreet vantage points (paper). This work is in collaboration with Ashley Tews from CSIRO.
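One way to think about choosing a shadow vantage point is as scoring candidate positions: a position is only usable if it keeps line-of-sight and a minimum acoustic standoff, and among usable positions, shadowed ones closer to the standoff limit score better. The scoring function, terms and weights below are illustrative assumptions, not the project's actual formulation.

```python
# Hypothetical vantage-point scoring for covert tracking: require
# line-of-sight and a minimum acoustic standoff, then prefer shadowed
# positions, penalising proximity (closer means more detectable).

def vantage_score(has_los, in_shadow, dist_to_target,
                  min_standoff=10.0, w_shadow=1.0):
    if not has_los or dist_to_target < min_standoff:
        return float("-inf")                 # unusable vantage point
    return (w_shadow if in_shadow else 0.0) - 1.0 / dist_to_target

# Pick the best of three candidate (has_los, in_shadow, distance) tuples.
best = max(
    [(True, True, 15.0), (True, False, 30.0), (False, True, 12.0)],
    key=lambda v: vantage_score(*v))
```

As the target moves, the robot would re-score candidates each cycle and relocate only when its current vantage point drops below the alternatives.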

Extreme Robotics: Robots vs volcano

Sometimes it can be fun to push robotics to the extreme. Since 2014, in collaboration with Alistair Grinham and Simon Albert from The University of Queensland, we have been developing very low-cost (essentially disposable) imaging and robotic sampling systems to explore Kavachi, one of the most active submarine volcanoes in the South Pacific. Initially this was just a fun activity to blow up some robots in a volcano, but from the data we collected we discovered some amazing things, such as sharks living in one of the most hostile places on Earth. We were lucky enough to team up with explorers from National Geographic, who produced a number of cool YouTube videos on what we were finding. The next trip is planned for late 2018 with some even cooler robotic tech.

Video: Robot vs Volcano

Video: Sharks Discovered Inside Underwater Volcano

Also there is a journal paper on the latest findings: Exploring the “Sharkcano”: Biogeochemical observations of the Kavachi submarine volcano (Solomon Islands).

Robots Past and Present

Over the last 14 years I have developed and built many field robot platforms for the sea, land and air domains. Particular emphasis has been on applying them to undertake complex tasks and answer specific questions, particularly relating to environmental science.

Present Robots

Robots developed and used since joining QUT in June 2013.


Past Robots

These are the many robots I worked on and developed whilst working at the CSIRO Autonomous Systems Laboratory.



Complete publication list and citation analysis is available from Google Scholar.

You can also access most of my publications at the QUT ePrints repository.

Selected Papers

Dunbabin, M. and Grinham, A. (2017). Quantifying Spatiotemporal Greenhouse Gas Emissions Using Autonomous Surface Vehicles, Journal of Field Robotics, 34(1),  pp 151-169.

Phillips, B.T., Dunbabin, M., Henning, B., Howell, C., DeCiccio, A., Flinders, A., Kelley, K.A., Scott, J.J., Albert, S., Carey, S., Tsadok, R. and Grinham, A. (2016). Exploring the "Sharkcano": Biogeochemical observations of the Kavachi submarine volcano (Solomon Islands). Oceanography, 29(4), pp. 160-169.

Dunbabin, M. and Marques, L. (2012). Robotics for environmental monitoring: Significant advancements & applications, IEEE Robotics & Automation Magazine, 19(1), pp. 24-39.

Grinham, A., Dunbabin, M., Gale, D. and Udy, J. (2011). Quantification of ebullitive and diffusive methane release to atmosphere from a water storage, Atmospheric Environment, 45(39), pp. 7166-7173, doi:10.1016/j.atmosenv.2011.09.011.

Roser, M., Dunbabin, M., and Geiger, A. (2014). Simultaneous Underwater Visibility Assessment, Enhancement and Improved Stereo, In Proc. International Conference on Robotics & Automation (ICRA), Hong Kong.

Witt, J., and Dunbabin, M. (2008). Go with the flow: Optimal AUV path planning in coastal environments. In Proc. 2008 Australasian Conference on Robotics & Automation, Canberra, pp. 1-9 (online proceedings).

Dunbabin, M., Corke, P., Vasilescu, I., and Rus, D. (2006). Data muling over underwater wireless sensor networks using autonomous underwater vehicles. In Proc. International Conference on Robotics & Automation (ICRA), pp. 2091-2098.

Vasilescu, I., Kotay, K., Rus, D., Dunbabin, M., and Corke, P. (2005). Data collection, storage and retrieval with an underwater sensor network. In Proc. IEEE SenSys, pp.154-165.

Dunbabin, M., Roberts, J., Usher K., Winstanley, G., and Corke, P. (2005). A hybrid AUV design for shallow water reef navigation.  In Proc. of the International Conference on Robotics & Automation (ICRA), April, pp. 2117-2122.

Contact Details

Dr Matthew Dunbabin | Professor (Autonomous Systems)

Institute for Future Environments | School of Electrical Engineering and Computer Science

Science and Engineering Faculty | Queensland University of Technology

phone: +61 7 3138 0392 | fax: +61 7 3138 1469

Gardens Point, S Block 1107 | 2 George Street, Brisbane, QLD 4000 | CRICOS No. 00213J

