
Pilot Project Grant on "Automated Environmental Change Monitoring"

This page is an informal account of our grant to conduct innovative research and pilot trials in Automated Environmental Change Monitoring, in collaboration with expert field roboticist Matthew Dunbabin and leading ecologist Jennifer Firn. We are adapting robotic navigation algorithms (SeqSLAM, for example) to automatically align data from multiple surveying passes of an environment that is changing over time. Monitoring of environments, and in particular of environmental change, is a highly topical scientific research area, both internationally and in Australia. This project will develop technology demonstrators for monitoring and quantifying environmental change over time in a range of ecological and environmental scenarios.


Paper Reference: 

Milford, Michael, et al. "Automated sensory data alignment for environmental and epidermal change monitoring." Australasian Conference on Robotics and Automation 2014. Australian Robotic and Automation Association, 2014.

Paper download here

Partially funded by a QUT School of Electrical Engineering and Computer Science Pilot Project Grant


This is a somewhat dynamic page showcasing the tools we can bring to applied areas such as environmental change monitoring. In essence, we can synchronize visual imagery from traverses of an environment under very different conditions or across significant environmental changes, such as foliage changes or coral reef changes. The idea is that you could then use this synchronized surveillance as is, or run further algorithms on it, such as volumetric analysis or fauna/flora recognition.
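As a rough illustration of the synchronization idea, here is a minimal sketch of SeqSLAM-style sequence alignment: heavily downsample and contrast-normalise each frame, build a pairwise difference matrix between the two traverses, and pick the offset whose straight-line path through the matrix has the lowest total cost. This is a much-simplified stand-in for the real algorithm (which also searches over speed differences); all function names are illustrative.

```python
import numpy as np

def preprocess(img, size=(8, 8)):
    """Downsample to a tiny thumbnail and normalise contrast, so that
    matching relies on scene layout rather than fine, condition-dependent
    detail. Assumes a single-channel image."""
    h, w = img.shape
    sh, sw = size
    # crude block-mean downsample (crop so dimensions divide evenly)
    small = img[:h - h % sh, :w - w % sw].reshape(
        sh, h // sh, sw, w // sw).mean(axis=(1, 3))
    return (small - small.mean()) / (small.std() + 1e-9)

def difference_matrix(seq_a, seq_b):
    """Pairwise mean absolute difference between two image sequences."""
    D = np.zeros((len(seq_a), len(seq_b)))
    for i, a in enumerate(seq_a):
        for j, b in enumerate(seq_b):
            D[i, j] = np.abs(a - b).mean()
    return D

def align(query_seq, ref_seq):
    """Return the offset into ref_seq that best matches query_seq,
    by scoring straight diagonal trajectories through the difference
    matrix (assumes equal traversal speed, a pure temporal offset)."""
    D = difference_matrix([preprocess(f) for f in query_seq],
                          [preprocess(f) for f in ref_seq])
    n, m = D.shape
    best_offset, best_cost = 0, np.inf
    for offset in range(m - n + 1):
        cost = sum(D[i, i + offset] for i in range(n))
        if cost < best_cost:
            best_offset, best_cost = offset, cost
    return best_offset
```

Once `align` has recovered the offset, frame `i` of the query traverse corresponds to frame `i + offset` of the reference traverse, and those paired frames are what the downstream change-analysis algorithms would consume.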

Most of these videos are merely proof-of-concept studies, but they should give enough of an idea of how you could extend the approach to more specific applications. YouTube also plays havoc with image quality; the original videos are sharper (be sure to set playback to HD too).

Automated Alignment of World War I/II Aerial Imagery with Modern Satellite Imagery

Aligning old aerial photographs to modern imagery is normally done laboriously by hand and is in high demand from historians, documentary makers and many others; we can automate the process. GPS information is not available for old photographs, so image-based alignment is crucial.

  1. Bailleul, France was severely bombed in 1917. Below is an automatic alignment of aerial imagery from WWI with the modern imagery. 
  2. Battle of Passchendaele, 1917.
  3. Berlin Brandenburg Gate 1945 matched to current day Google maps with 3D buildings
  4. 1942 Brisbane - current day Brisbane.
  5. QUT old parliament house to current day
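Since the historical photographs carry no GPS metadata, registration has to come from appearance alone. A toy sketch of that idea: exhaustively search small translations of the old image against the new one, scoring each candidate by mean absolute difference on contrast-normalised crops (the normalisation compensates for the very different film/sensor responses). This is an illustrative stand-in, not the project's actual registration pipeline, and real historical alignment also needs rotation and scale handling.

```python
import numpy as np

def best_translation(old_img, new_img, max_shift=5):
    """Find the (dy, dx) shift of old_img that best explains new_img,
    i.e. old_img[y + dy, x + dx] ~ new_img[y, x], by brute-force search
    over small translations. Both images are single-channel, same size."""
    def norm(x):
        x = x.astype(float)
        return (x - x.mean()) / (x.std() + 1e-9)

    h, w = old_img.shape
    best, best_cost = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # overlapping crops of the two images under this shift
            a = old_img[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)]
            b = new_img[max(0, -dy):h + min(0, -dy), max(0, -dx):w + min(0, -dx)]
            cost = np.abs(norm(a) - norm(b)).mean()
            if cost < best_cost:
                best, best_cost = (dy, dx), cost
    return best
```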



Kroombit Tops First Runs

Thanks to Eugene Mason. Shown: frame correspondences between the first and second runs, sample aligned imagery (foggy rain matched to sunny daytime, relevant to Matt's fair-weather bias observations), and vertically and horizontally alternating images showing alignment accuracy.




Some animated globally and locally aligned GIFs:

Forest canopy

Freshly cut grassland

Foliage alongside pathway



Greenhouse border

Grass verge of forest

Lantana Trimming


Collage of Some Change Alignment so Far

Plantations, farms, creek beds, foliage, day-night aerial imagery, pre-post aerial tsunami imagery, pre-post aerial flooding imagery, pre-post aerial bushfire imagery.

Strawberry Farm

Pine Forest Plantation

O'Reillys Synchronized Surveys

Forward and sideways facing cameras.


Aerial Before, During and After Localization and Synchronization

Using a modified version of SeqSLAM to localize within (and, in the process, synchronize) aerial imagery before, during and after flooding and bushfire events. The first flooding video in particular matches aerial data gathered by ARCAA against separately gathered Nearmaps data from the Brisbane 2011 floods. The bushfire imagery is from the Toodyay bushfires.
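The reason matching survives such drastic appearance change (a flooded scene is globally darker and lower-contrast than the same scene dry) is largely SeqSLAM's local patch normalisation: each small patch is shifted to zero mean and unit variance, so uniform brightness and contrast changes cancel out. A minimal sketch of that one step, with illustrative names:

```python
import numpy as np

def patch_normalise(img, patch=4):
    """SeqSLAM-style local patch normalisation: each patch x patch block
    is rescaled to zero mean and unit variance, so a globally darker
    'during-flood' frame still closely resembles its 'before' counterpart."""
    out = np.empty_like(img, dtype=float)
    h, w = img.shape
    for y in range(0, h, patch):
        for x in range(0, w, patch):
            p = img[y:y + patch, x:x + patch].astype(float)
            out[y:y + patch, x:x + patch] = (p - p.mean()) / (p.std() + 1e-9)
    return out
```

A useful property to note: applying any positive linear brightness/contrast change (`0.5 * img + 10`, say) leaves the normalised output essentially unchanged, which is exactly what the cross-condition matching relies on.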

Flooding 1

Matched Sequence

Flooding 2

Matched Sequence


Matched Sequence

3D medium resolution reconstruction of the botanic gardens area and mapped path:


Grass Before and After Mowing

Re-run with crisper imagery:

Sample frame matches

Note the before-after mowing grass change and the significant variation in lighting and moderate camera viewpoint change (click on images for larger version):


Older dataset with imagery blurred by low light; this is easily fixable (see above).

Forward View

Overhead View

Aerial Surveillance

Actual ARCAA dataset, place recognition and imagery synchronization.

Samford Ecological Reserve

Overview video and an example of synchronizing consecutive traverses through the environment. This process should still work just as well even if you change the season, weather or time of day, or even cut down half the trees (perhaps all the trees). You could use the synchronization for qualitative assessment, run a subsequent volumetric analysis to quantify vegetation growth and change, or even run plant and fauna recognition algorithms on the synchronized footage to see the change.
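Once frames are synchronized, even a very simple downstream analysis becomes possible. As a hypothetical example of the kind of quantitative assessment mentioned above (not something the project videos themselves compute), one could threshold the per-pixel difference between a pair of aligned frames and report what fraction of the scene changed:

```python
import numpy as np

def change_map(frame_before, frame_after, threshold=0.2):
    """Boolean mask of pixels whose intensity changed by more than
    `threshold` between two frames that the alignment step has already
    synchronized. Frames are single-channel arrays of the same shape."""
    diff = np.abs(frame_after.astype(float) - frame_before.astype(float))
    return diff > threshold

def changed_fraction(frame_before, frame_after, threshold=0.2):
    """Fraction of the scene flagged as changed: a crude proxy for,
    e.g., vegetation loss between traverses."""
    return change_map(frame_before, frame_after, threshold).mean()
```

In practice you would want something far more robust (illumination-invariant features, volumetric comparison of the 3D models), but this illustrates why synchronization is the enabling step: without aligned frames, per-pixel comparison is meaningless.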

Video fly through of 3D model:

Sideways Model:


Gravel (Dry weather / flood analogue)

Since it is difficult to obtain during- and after-flood aerial imagery, this is a mock-up using a fairly perceptually ambiguous gravel surface with "flooding".

Video fly through of 3D model: 



Other Examples

This short animation shows an example of matching images across day-night cycles using our algorithms.


