
Supervisors, please note that BEB801/2 projects are advertised and allocated through the InPlace system: https://inplace.qut.edu.au/

 

Some of you may have encountered problems logging in to InPlace to add a BEB801/2 project. A very common cause is that you are not listed as an InPlace user. For this, or any other problem you encounter, please contact sef.wil@qut.edu.au or call ext. 80499 to sort it out.

Interested in doing a PhD?

If you're interested in doing a PhD then consider one of these projects. It's a chance to see if you like the work/research environment before you commit 3 years of your life to a PhD!

 

We have a great range of projects in mechatronics, robotics and computer vision.  Please directly contact the supervisors associated with the projects.

Amazon Robotics Challenge 2017

 

Description

Amazon runs an annual challenge to help solve issues related to picking and interacting with a wide range of objects robustly. The challenge is framed within the scope of picking items (that Amazon sells) from a shelf into a shopping container, as well as picking objects out of returned mail and placing them back on the shelf for storage. We took part last year at RoboCup in Germany with four undergraduate students (see our Amazon Picking Challenge team from 2016). This year's competition will be held in Japan in late July and will focus on robustly picking objects that were not seen beforehand. For this we aim to create a team (based on undergraduate students) that will start from last year's entry and create a more robust, more flexible and more efficient solution to this problem.

There is room for a wide variety of projects within the scope of the challenge this year, for example:

  1. Integration of computer vision algorithms into the current system

  2. Creating a feedback loop for more successful grasping (e.g. visual servoing; see the sketch after this list)

  3. Mechanical design of an end-effector using mechanical fingers and suction
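
For project idea 2, the sketch below illustrates the basic idea of an image-based visual servoing loop in Python: the pixel error between a detected grasp point and a target image location is mapped to a camera-frame velocity command through a proportional gain. The function, gains, focal length and depth handling are illustrative assumptions, not part of our existing system.

import numpy as np

# Illustrative proportional image-based visual servoing step (all values assumed):
# drive a detected grasp point towards a target pixel location.
def ibvs_velocity(grasp_px, target_px, depth, f=600.0, gain=0.5):
    # grasp_px, target_px: (u, v) pixel coordinates; depth in metres;
    # f: assumed focal length in pixels. Returns (vx, vy, vz) in the camera frame.
    error = np.array(target_px, dtype=float) - np.array(grasp_px, dtype=float)
    # Pixel error scaled by depth/f approximates a metric offset in the image
    # plane; command a velocity proportional to that offset.
    v_xy = gain * error * depth / f
    v_z = gain * 0.1 * depth                 # also creep slowly towards the object
    return np.array([v_xy[0], v_xy[1], v_z])

# Example: grasp point seen at pixel (300, 260), target is the image centre (320, 240).
print(ibvs_velocity((300, 260), (320, 240), depth=0.4))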

Skills

A broad range of skills is required, and the various subtasks may call for specialist skills.
Generally, high-GPA and highly motivated students are preferred (it is quite likely this will take a lot of your time!)

Last year our code repository (the code submitted to Amazon) ran to more than 1 million lines of code, so good programming skills are helpful.
Any prior knowledge of the following will definitely help:
mechanical design, computer vision, machine learning, GitHub and ROS, or just building hardware and/or software systems.

No. of Students

3-5

Contact

Juxi Leitner <j.leitner@qut.edu.au> for information about the challenge and being part of it
More information about the Amazon Challenge: https://www.amazonrobotics.com/#
More information about our 2016 entry: http://Juxi.net/projects/apc

Peter Corke's Projects

About Peter. Project details below are brief; contact me to discuss. If you email me, please be sure to put "BEB project" in the subject line so that I notice it.

1: Image processing using FPGA hardware

Description

FPGAs offer a means for high-performance computation at low electrical power consumption. To date, however, programming FPGA hardware has been cumbersome. This project investigates the use of MATLAB HDL Coder tools to embed algorithms into Xilinx Zynq FPGA hardware.

The following tasks are required

  1. Procure the Zynq hardware and a suitable camera, and commission them.

  2. Create a number of image processing pipelines, code them in Simulink and embed them in the Zynq hardware. Investigate the maximum achievable frame rates and the power consumption.

  3. Embed a deep neural network inference system in the Zynq hardware.

Skills

Familiar with MATLAB, image processing, FPGA hardware and electronics.

No. of Students

1

2: Integrating the MATLAB Robotics Toolbox with graphical robot simulators

Description

Extend the existing Robotics Toolbox to give it the capability to interact with graphical robot simulation environments such as V-REP and Gazebo. This will allow programs written in MATLAB to process virtual sensor data and move the robot around the simulated world to achieve useful tasks such as the TRS task.

The following tasks are required

  1. Design and implement new MATLAB classes that provide the Toolbox with a uniform interface to at least V-REP and Gazebo.

  2. Implement a mobile robotic pick and place task such as TRS using MATLAB and the Robotics Toolbox

Skills

Familiar with MATLAB, object-oriented programming, V-REP, Gazebo and ROS

No. of Students

1

3: Expressway cam

Description

Install a camera and Raspberry Pi computer in the S11 common area to give real-time reports (via a web page) of traffic conditions.

The following tasks are required

  1. Develop algorithms to process images and determine the speed and density of traffic in each lane (day and night) using MATLAB. Use code generation tools to create run-time code for the Pi. (A minimal density-estimation sketch follows this list.)

  2. Create a web server to provide this information to any user
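
As a starting point for task 1, here is one possible way to estimate per-lane traffic density using background subtraction. The brief specifies MATLAB; this sketch uses Python/OpenCV purely for illustration, and the video source, lane regions and filter parameters are made-up placeholders.

import cv2

# Possible density-estimation sketch using OpenCV background subtraction (MOG2).
# Lane regions of interest are hand-defined pixel slices (placeholders);
# "density" is simply the fraction of moving-foreground pixels in each lane.
cap = cv2.VideoCapture("expressway.mp4")           # placeholder video source
bg = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=25)

lane_rois = {"lane_1": (slice(200, 400), slice(0, 160)),     # (rows, cols), made up
             "lane_2": (slice(200, 400), slice(160, 320))}

while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg = bg.apply(frame)                           # 255 marks moving pixels
    fg = cv2.medianBlur(fg, 5)                     # suppress speckle noise
    for name, (rows, cols) in lane_rois.items():
        roi = fg[rows, cols]
        density = float((roi == 255).mean())       # fraction of "vehicle" pixels
        print(name, round(density, 3))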

Skills

Familiar with image processing, MATLAB or OpenCV, embedded programming, web servers

No. of Students

1


Juxi Leitner's Projects

To find out more about me and my research projects, have a look at http://Juxi.net 
Generally my interest is in making robots more adaptive and autonomous, focussing on how to integrate perception with the action side of things (see Vision and Action in the ACRV).

Email: j.leitner@qut.edu.au

 

Title: Design and Development of a Robotic Lunar Payload

Description

The Google Lunar XPRIZE team offers the opportunity to send a small CubeSat-sized payload to the Moon. The idea of this project is to design and develop a functioning integrated system in a 10x10x10 cm cube weighing less than 1.3 kg to perform some measurement/science operations, based on the ArduSat platform.

 

We have already built a basic system, and there is room for collaboration with CSIRO and other partners.

 

http://tinyURL.com/QUTLunaRoo (other related pages: http://www.ardusat.com http://ptscientists.com/go/space)

Things we could really use a hand with are:

  • structural design for hopping robots (stress analysis and material evaluation)
  • electronics design for space probes /cubesats
  • thermal management for electronics in space
  • integration of vision algorithms in small hopping robots
  • design of a common hopping platform (10cm cubed) together with CSIRO
  • ...

 

Skills

Motivation for space, high GPA, Mechanics, Electronics, Programming, Maths

Students

2-4

 

 

Title: Making Pepper Robots Better

Description

Together with SoftBank Robotics Europe we are interested in improving the state of the art in humanoid robots.

Possible starting points and shared interests with SBR Europe are:

There may also be opportunities to visit SBR Europe in Paris or Inria in Rennes, which also have Peppers and a ROMEO.

Skills

Motivation for building real-world robotic systems, high GPA, Programming, Maths, a bit of Mechanics and Electronics

Students

1-2

Collaboration: together with SoftBank Robotics Europe in Paris

 

Title: Low Cost (possibly Soft Robotics) End-Effector for Robotic Manipulation

Description

There are many projects (some open source, some less so) building cheap(er) robotic end-effectors, from anthropomorphically inspired hands to pneumatically driven octopus-like suckers.

The plan for this project is to investigate what is out there and to design and build a cheap system over the course of the final-year project, leading to a test of the system at the end of the year within the Amazon Robotics Challenge framework.

Possible starting points:

Skills

Motivation for tinkering, high GPA, Mechanics, Electronics, Programming, Maths

Students

1-2

Collaboration: together with Chris Lehnert

 

Title: Low Cost Printable Arm for Robotic Manipulation

Description

There is now a range of projects (some open source, some less so) that are printing and building cheap(er) robot arms.

The plan for this project is to investigate potential designs and, over the course of the final-year project, select and improve one of these designs for addition to a Pioneer robot. This may also involve working with the above project on developing a 3D-printed robot hand.

Project Goals:

Skills

3D design & CAD skills, motivation for tinkering, high GPA, Mechanics, Electronics, Programming, Maths

Students

1

Supervisors: Steven Martin and Juxi Leitner

Jason Ford's Projects

Title: Power pole inspection automation
Supervisors: Ford and McFadyen

Description

This project will develop a simulation environment containing at least three power poles in a segment of a power-line network, and then develop a controller for a simulated rotorcraft UAV that automates wire following for transit of the UAV between power poles. The simulation environment is to be developed in Gazebo and the Robot Operating System (ROS). The wire-following automation will need to account for wire sag and similar effects between the poles.
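
To make the wire-following idea concrete, here is a minimal Python sketch with made-up gains and state estimates: a proportional controller holds a forward speed along the wire while correcting lateral offset and heading error. A real implementation would obtain these estimates from perception and publish velocity commands through ROS.

# Illustrative wire-following control step. The lateral offset from the wire
# and the heading error relative to the wire direction are assumed to come
# from a perception module; gains and speeds are placeholders.
def wire_following_cmd(lateral_offset_m, heading_error_rad,
                       k_y=0.8, k_psi=1.2, forward_speed=1.0):
    v_forward = forward_speed                 # fly along the wire
    v_lateral = -k_y * lateral_offset_m       # move back over the wire
    yaw_rate = -k_psi * heading_error_rad     # align with the wire direction
    return v_forward, v_lateral, yaw_rate

print(wire_following_cmd(0.5, 0.1))           # 0.5 m offset, 0.1 rad skew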

Skills

C++ and/or Python coding skills, and the ability to work with ROS.

Students

1

Title: High-performance UAV motion planning and control for passage through tight gaps
Supervisors: Ford

Description

This project will develop high-performance UAV control that exploits new tensor decompositions to overcome the curse of dimensionality that hinders the dynamic programming approach to non-linear optimal control problems. These tensor decompositions effectively “compress” the representation of the optimal solution onto a low-dimensional space, allowing the practical representation and calculation of the optimal controller. In this project, the performance of optimal control solutions arising from the tensor representations will be compared with that of cascade control approaches for UAV control (initially in simulated environments, but if sufficient progress is made then flight testing may occur).

Skills

High-GPA students only (GPA > 6.5). Desired skills: C++ programming and mathematical skills are essential. The best candidate would have both maths and engineering training.

Students

1

Title: UFO detection: a computationally efficient ROS node for vision-based unknown aircraft (flying object) detection
Supervisors: Ford and James

Description

Vision-based aircraft detection is a challenging problem due to the small size of the aircraft in a visually cluttered sensing environment. This project will develop a ROS node for vision-based aircraft detection. Techniques investigated will include morphological processing and Viterbi-algorithm UFO detection approaches. The project will involve working in ROS with C++ and the OpenCV libraries, and/or Python.
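
As an illustration of the morphological stage only (not the temporal Viterbi stage), the Python/OpenCV sketch below applies a close-minus-open (CMO) filter, which emphasises small bright or dark targets against cluttered backgrounds such as cloud. The kernel size, threshold and synthetic test image are assumed values.

import cv2
import numpy as np

# Illustrative "close-minus-open" (CMO) morphological filter for small-target
# detection; kernel size and threshold are placeholders.
def cmo_filter(grey, ksize=5):
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (ksize, ksize))
    closed = cv2.morphologyEx(grey, cv2.MORPH_CLOSE, kernel)
    opened = cv2.morphologyEx(grey, cv2.MORPH_OPEN, kernel)
    return cv2.subtract(closed, opened)        # strong response at small targets

grey = np.zeros((480, 640), np.uint8)          # synthetic frame standing in for camera imagery
grey[240, 320] = 200                           # a faint dot standing in for a distant aircraft
response = cmo_filter(grey)
_, candidates = cv2.threshold(response, 40, 255, cv2.THRESH_BINARY)
print(int(candidates.max()))                   # 255 where a candidate target was found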

Skills

Desired skills: C++ programming and experience in implementing efficient computational algorithms.

Students

1

Title: Traffic/multi-vehicle configuration modelling via Markov chains
Supervisors: McFadyen and Ford

Description

This work involves modelling traffic configurations (such as air traffic) as a Markov chain to enable dynamic allocation of traffic-free routes or volumes. The models capture the spatio-temporal nature of traffic movements and are useful in the design and assignment of routes for autonomous systems such as unmanned aircraft or drones. The project requires strong mathematical modelling skills and computer programming (MATLAB). Students with a background in signal processing, control system design and/or estimation are encouraged to apply.
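
The toy Python sketch below illustrates the modelling idea only: traffic configurations are states of a Markov chain, a transition matrix captures how they evolve, and the stationary distribution indicates how often each configuration (e.g. a free route) occurs. The states and probabilities are invented for illustration.

import numpy as np

# Hypothetical traffic configurations as Markov chain states; P is made up.
states = ["route A busy", "route B busy", "all clear"]
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.2, 0.6]])

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalised.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi = pi / pi.sum()
for s, p in zip(states, pi):
    print(f"{s}: {p:.3f}")       # long-run fraction of time in each configuration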

Skills

Desired skills: C++ programming and experience in implementing efficient computational algorithms.

Students

1

Title: Investigation of Stochastic MPC (Model Predictive Control)
Supervisors: McFadyen and Ford

Description

This work involves system modelling and control design for non-linear systems by combining visual servoing and stochastic model predictive control techniques. The project will initially focus on simulation studies followed by implementation using the visual servoing playpen currently under development. The project requires strong mathematical modelling skills and computer programming (MATLAB and ROS). Students with a background or interest in control system design are encouraged to apply.

Skills

Desired skills: C++ programming and experience in implementing efficient computational algorithms.

Students

1

Title: Deep learning approach to vision-based aircraft detection for mid-air collision avoidance
Supervisors: Ford and James

Description

 

QUT has developed world-leading vision-based mid-air collision detection technology that has been flight tested on manned aircraft and Insitu’s ScanEagle UAS: https://eprints.qut.edu.au/100005/
This project will examine the suitability of deep learning approaches to vision-based aircraft detection as a replacement for the current state of the art. It will involve learning classifiers for both aircraft and cloud-artefact features. The detector must handle aircraft at different ranges (i.e. different sizes and visual appearances), with a target of detecting aircraft 2 to 50 pixels wide in greyscale imagery, and should be able to track detections through time and across frames as the aircraft approaches. Deliverables are software (an end-to-end detection system) plus a report.
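
To give a flavour of one possible component, here is a minimal patch-classifier sketch, assuming PyTorch is an acceptable tool; it classifies 32x32 greyscale patches as aircraft versus background/cloud artefact, and leaves out the candidate-generation and temporal-tracking stages the full system would need.

import torch
import torch.nn as nn

# Illustrative small CNN for 32x32 greyscale patches (aircraft vs not).
class PatchClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32 -> 16
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 64), nn.ReLU(),
            nn.Linear(64, 2),                    # aircraft / not aircraft
        )

    def forward(self, x):
        return self.net(x)

model = PatchClassifier()
dummy = torch.randn(4, 1, 32, 32)                # batch of 4 greyscale patches
print(model(dummy).shape)                        # torch.Size([4, 2])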

Skills

Desired skills: Python and/or C/C++, OpenCV. Experience with computer vision algorithms is an advantage.

Students

1

Title: UAV mid-air collision simulation environment (new for Semester 2, 2017)
Supervisors: Ford and Molloy

Description

QUT has developed world-leading vision-based mid-air collision detection technology that has been flight tested on manned aircraft and Insitu’s ScanEagle UAS: https://eprints.qut.edu.au/100005/
This project will investigate the automation of aircraft collision avoidance for small-to-medium-size fixed-wing unmanned aircraft, and develop an appropriate simulation and visualisation framework. The project will need to consider collision avoidance of multiple aircraft, realistic flight dynamics, the rules of the air, and the need for the aircraft to return to its mission after successful avoidance. The project may also consider automation of the aircraft during lost-link scenarios. This investigation will be carried out in MATLAB/Simulink.

Skills

Desired skills: Simulink/Matlab, modelling of dynamic systems, and control.

Students

1

 

 

Chris Lehnert's Projects

Project details below are brief; contact me to discuss. If you email me, please be sure to put "BEB project" in the subject line so that I notice it.

Email: c.lehnert@qut.edu.au

1: Torque/Force control for applications such as robotic fruit harvesting

 

Title

Torque/Force control for applications such as robotic fruit harvesting

Description

This project aims to develop a torque control method allowing a robot arm to regulate the torque/force it exerts on its environment. This includes the application of torque/force control to harvesting delicate fruit without damage. The project will require the application of robot control theory, and there will be the opportunity to test on a real 6-DoF robot arm designed for robotic fruit harvesting.
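
A minimal sketch of the control idea, assuming a simple admittance-style scheme with a wrist force/torque sensor; the gains, limits and sensor values are placeholders, not the method prescribed for the project.

import numpy as np

# Illustrative admittance-style force controller: command a Cartesian velocity
# proportional to the error between a desired contact force and the measured
# force, so the gripper presses on the fruit only as hard as intended.
def admittance_step(f_desired, f_measured, k_f=0.002, v_max=0.05):
    # Returns a velocity command [m/s] along the contact normal.
    error = f_desired - f_measured             # force error [N]
    v = k_f * error                            # admittance gain maps N -> m/s
    return float(np.clip(v, -v_max, v_max))    # keep the motion gentle

# Example: we want 2 N of contact force but currently measure 0.5 N,
# so the controller commands a slow approach along the contact normal.
print(admittance_step(2.0, 0.5))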

Skills

Desired skills: Knowledge of robot control theory, C++ programming and experience in implementing efficient computational algorithms.

Students

1

2: Design of a Novel Omnidirectional Spherical Wheel

 

Title

Design of a Novel Omnidirectional Spherical Wheel

Description

We have developed a continuous isotropic spherical omnidirectional drive mechanism that is efficient in its mechanical simplicity and use of volume. Spherical omnidirectional mechanisms allow isotropic motion, although many are prevented from achieving true isotropic motion by practical mechanical design considerations. A prototype platform was built using a combination of machining and 3D plastic printing and is illustrated in the images above. This project aims to develop the next prototype, improving its performance and addressing some flaws in the current design.

Skills

Desired skills: Skills in Electro-mechanical Design, Maths, Dynamics and Control Theory

Students

1

3: Plant classification for fruit harvesting

Title

Plant classification for fruit harvesting

Description

The aim of this project is to create a system for detecting and classifying key parts of plants for use in robotic fruit harvesting. For a robot to successfully harvest fruit it needs to understand what parts of the plant should be avoided, such as the main stem or branches of a plant. This work will look at developing a classification system that can identify the key parts of the plant for use on a robotic fruit harvester that has been developed at QUT. 

Skills

Desired skills: Robotic Vision, Machine Learning, C++ or Python.

Students

1

4: RoboRoo: Robot Kangaroo using Parallel Variable Elastic Actuators


 

Title

RoboRoo: Robot Kangaroo using Parallel Variable Elastic Actuators

Description

The aim of this project is to create a novel legged robot that uses similar mechanics to a kangaroo. Kangaroos have an impressive transport efficiency at high speeds (over 10 km/h), and this is due to the design of their legs. This project will look at replicating this mechanism and creating a legged robot with a very low cost of transport.

Skills

Desired skills: Robotic Vision, Machine Learning, C++ or Python.

Students

1

Collaboration: joint interest from Juxi Leitner

Michael Milford's Projects

Project details below are brief; contact the specified person to discuss. If you email, please be sure to put "BEB project" in the subject line so that it gets noticed.

Supervisors: Milford, Lehnert and Mount

Description

Design of a small-scale "smart motor" to allow the rapid prototyping of small mobile platforms for research and education.

Skills

Desired skills: Mechanical Design, Electronics/PCB Design, C/C++ Programming, Understanding of Control Systems

Students

1

Contact: James Mount, j.mount@qut.edu.au

Frederic Maire's Projects

 

Thierry Peynot's Projects

1: Towards Robotic Knee Arthroscopy

Description

In this project you will contribute to the development of a robotic system to perform knee arthroscopy (minimally invasive surgery). In the near term the system will assist a surgeon; in the long term it should be capable of performing the arthroscopy fully autonomously. The current system is composed of a robotic arm manipulating an arthroscope (a mini camera with optics at the end of a small stick, see picture above). A number of aspects need to be investigated, including (but not limited to):

  • Automatic adjustment of the arthroscope camera settings for best-quality viewing and/or optimal performance of vision algorithms
  • Automatic arthroscope calibration and hand-eye calibration
  • Experimental evaluation of the performance of different 3D reconstruction and SLAM algorithms, in the arthroscopy context
  • Accurate and reliable tracking of the arthroscope and of the knee configuration using an OptiTrack system
  • etc.

Skills

  • Some knowledge of Computer Vision and/or Robotics
  • Matlab and/or C/C++ programming

No. of Students

1-2

2: Reliable Scene Understanding in Mining Environments

Description

This project concerns the evaluation and development of scene understanding techniques that can be appropriate, reliable and effective in mining environments. An example is the detection (and potentially tracking) of dynamic objects in the environment, such as people and vehicles. Computer vision is of particular interest, but techniques considered could also use data acquired by LIDARs, RADARs or other sensors.

Skills

  • Knowledge of Computer Vision and/or Signal Processing
  • Solid Matlab, Python or C/C++ programming
  • Knowledge of the Image Processing Toolbox (Matlab) or OpenCV would be a plus

No. of Students

up to 3

3: What can we learn from the differences of perception between sensor modalities?

Description

Combining different sensor modalities can often lead to more reliable perception, if the sensor data is interpreted appropriately. For example, a visual camera is affected by the presence of smoke, while an infrared camera could allow the robot to see through it. However, these differences of perception are often neither exploited as such nor fully understood. This project will focus on these differences of perception and investigate what can be learnt from them.

Skills

  • Knowledge of Computer Vision and/or Signal Processing
  • Some knowledge of sensors such as Camera, LIDAR, RADAR
  • Solid Matlab, Python or C/C++ programming
  • Preferred: Good knowledge of machine learning algorithms
  • Preferred: Knowledge of the Image Processing Toolbox (Matlab) or OpenCV

No. of Students

1

 

4: Can RADAR perception be Vision with microwaves?

 

Description

This project will use deep learning techniques to train a perception system to be (almost) as good with RADAR as it can be with a visual camera.

Skills

  • Knowledge of Computer Vision and/or Signal Processing
  • Some knowledge of cameras and RADAR
  • Solid Matlab, Python or C/C++ programming
  • Solid knowledge of machine learning algorithms

No. of Students

1-2

5: Towards an Autonomous Astrobiologist Rover (Automatic Stromatolite Recognition)

Description

Astrobiologists look for signs of life on other planets, such as Mars. In particular, they hope to find stromatolites, i.e. rock structures that were formed by a biogenic process. The end goal of this project is to give a planetary rover the ability to help astrobiologists with this mission, by autonomously detecting stromatolites using computer vision. In this component of the project, the student(s) will investigate, develop and test algorithms that can detect and recognise characteristics associated with biogenicity (can you do it on the picture above?).
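
One possible (assumed, not prescribed) starting point is classical texture classification. The Python sketch below describes image patches with local binary pattern histograms and trains an SVM on placeholder data, just to show the shape of such a pipeline.

import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

# Illustrative texture-classification pipeline: LBP histograms + SVM.
# The training patches and labels are random stand-ins for real rock imagery
# labelled by an expert.
def lbp_histogram(patch, p=8, r=1.0):
    lbp = local_binary_pattern(patch, P=p, R=r, method="uniform")
    hist, _ = np.histogram(lbp, bins=p + 2, range=(0, p + 2), density=True)
    return hist

rng = np.random.default_rng(0)
patches = (rng.random((40, 64, 64)) * 255).astype(np.uint8)   # fake grey patches
labels = rng.integers(0, 2, size=40)                          # fake expert labels

features = np.array([lbp_histogram(p) for p in patches])
clf = SVC(kernel="rbf").fit(features, labels)
print(clf.predict(features[:5]))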

Skills

  • Computer Vision
  • Solid Matlab or C/C++ programming
  • Knowledge of the Image Processing Toolbox (Matlab) or OpenCV

No. of Students

1 or 2
