
Supervisors, please note that these projects are advertised and allocated through the InPlace system: https://inplace.qut.edu.au/

 

Some of you may have encountered problems logging in to InPlace to add a BEB801/2 project. A very common cause is that you are not listed as an InPlace user. For this reason, or any other problem you may encounter, please contact sef.wil@qut.edu.au or call ext. 80499 to sort it out.

Interested in doing a PhD?

If you're interested in doing a PhD then consider one of these projects. It's a chance to see if you like the work/research environment before you commit 3 years of your life to a PhD!

 

We have a great range of projects in mechatronics, robotics and computer vision.  Please directly contact the supervisors associated with the projects.

Peter Corke's Projects

About Peter. The project details below are brief; contact me to discuss. If you email me, please be sure to put "BEB project" in the subject line so that I notice it.

1: Robot voice interface

Description

Voice input is becoming ubiquitous with products like Google Home and Amazon Echo/Dot. These products have voice understanding APIs, as do other players such as IBM and Microsoft. This project is about leveraging consumer voice tech for robotics, to create an interface that lets you give voice commands to an arm robot in a tabletop environment, e.g. "pick up the blue block and put it next to the red block".
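As a rough illustration only (the grammar, colour names and the action dictionary below are invented for this sketch, not part of the project spec), a speech-to-text result might be parsed into a structured robot command like this:

    import re

    # Hypothetical transcription returned by a cloud speech-to-text API.
    utterance = "pick up the blue block and put it next to the red block"

    # A deliberately tiny grammar: "pick up the <colour> block ... next to the <colour> block".
    match = re.search(r"pick up the (\w+) block.*next to the (\w+) block", utterance)

    if match:
        target_colour, reference_colour = match.groups()
        # In the real project the block positions would come from the vision system.
        action = {
            "verb": "pick_and_place",
            "object": {"type": "block", "colour": target_colour},
            "relation": "next_to",
            "reference": {"type": "block", "colour": reference_colour},
        }
        print(action)  # hand this structure to the robot's motion planner
    else:
        print("Sorry, I did not understand that command.")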

The following tasks are required

  1. Survey the available tech and choose the most suitable one.
  2. Familiarise yourself with the robot, its tabletop working environment, and the vision system.
  3. Build an impressive voice control demonstration.
  4. Can you extend this to a dialogue? If there are two blue blocks, can the system recognise the ambiguity and ask "which blue block do you mean?"

Skills

Familiar with robotics, image processing, web services, ROS, MATLAB, Python.

No. of Students

1

2: Solutions manual for Robotics, Vision and Control

Description

This book, and its predecessor, are widely used for teaching. The book has lots of end-of-chapter problems but there are no answers! There is an historic opportunity to address this, and along the way learn a great deal of robotics. The solutions manual could be a traditional manual (probably created using LaTeX), a set of MATLAB Live Scripts, or both.

The following tasks are required

  1. Develop a framework for the project on GitHub: naming conventions, folder structure etc.
  2. Become familiar with the LaTeX environment and book-specific macros and conventions.
  3. Develop the solutions, using the woefully incomplete existing solutions manual as an example.

Skills

Familiar with MATLAB, writing and explaining, robotics, computer vision, willingness to learn

No. of Students

1

3: Porting the robotics toolbox to another language

Description

The Toolbox is widely used around the world for teaching and research. Although it is open source, it requires a MATLAB licence to use. There is an opportunity to port the Toolbox to other free and open languages such as Python or Julia. A partial port to Python already exists and could be built on to include mobile robot capability.
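To give a feel for what such a mapping could look like, here is a minimal NumPy sketch of two Toolbox-style functions; the actual API of any existing Python port may differ:

    import numpy as np

    def rot2(theta):
        """2x2 SO(2) rotation matrix, mirroring the Toolbox's MATLAB rot2(theta)."""
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s],
                         [s,  c]])

    def se2(x, y, theta=0.0):
        """3x3 homogeneous transform for a planar pose, mirroring MATLAB's se2(x, y, theta)."""
        T = np.eye(3)
        T[:2, :2] = rot2(theta)
        T[:2, 2] = [x, y]
        return T

    # Compose two planar transforms, as you would with T1 * T2 in MATLAB.
    T = se2(1.0, 2.0, np.pi / 2) @ se2(0.5, 0.0)
    print(T)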

The following tasks are required

  1. Choose the language and decide how to map the MATLAB syntax consistently to the new language.
  2. Port the code, documentation and comments to the new language.
  3. Develop unit tests.
  4. Release it to the world via GitHub.

Skills

Familiar with MATLAB, writing and explaining, robotics, computer vision, willingness to learn

No. of Students

Multiple projects possible.


Juxi Leitner's Projects

To find out more about me and my research projects, have a look at http://Juxi.net
Generally my interest is in making robots more adaptive and autonomous, focusing on how to integrate perception with the action side of things (see Vision and Action in the ACRV).

Email: j.leitner@qut.edu.au

 

Title: Design and Development of a Robotic Lunar Payload

Description

The Google Lunar XPrize team offers the opportunity to send a small CubeSat-sized payload to the Moon. The idea of this project is to design and develop a functioning integrated system in a 10×10×10 cm cube weighing less than 1.3 kg to perform some measurements/science operations, based on the ArduSat platform.

 

We have already built a basic system; there is room for collaboration with CSIRO and other partners.

 

http://tinyURL.com/QUTLunaRoo (other related pages: http://www.ardusat.com, http://ptscientists.com/go/space)

Things we could really use a hand with include:

  • structural design for hopping robots (stress analysis and material evaluation)
  • electronics design for space probes /cubesats
  • thermal management for electronics in space
  • integration of vision algorithms in small hopping robots
  • design of a common hopping platform (10cm cubed) together with CSIRO
  • ...

 

Skills

motivation for space, high GPA, Mechanics, Electronics, Programming, Maths

Students

2-4

 

 

Title: Making Pepper Robots Better

Description

Together with SoftBank Robotics Europe we are interested in improving the state of the art in humanoid robots.

Possible starting points and shared interests with SBR Europe are:

There may also be the possibility of visits to SBR Europe in Paris or Inria in Rennes, which also have Peppers and a ROMEO.

Skills

motivation for building real world robotic systems, high GPA, Programming, Maths, bit of Mechanics, Electronics

Students

1-2

Collaboration: together with SoftBank Robotics Europe in Paris

 

Title: Low Cost (possibly Soft Robotics) End-Effector for Robotic Manipulation

Description

There are many projects (some open source, some less so) building cheap(er) robotic end-effectors, from anthropomorphically inspired hands to pneumatically driven octopus-like suckers.

The plan for this project is to investigate what is out there and come up with a plan to build a cheap system over the course of the final year project, leading to a test of the system at the end of the year within the Amazon Robotics Challenge framework.

Possible starting points:

Skills

motivation for tinkering, high GPA, Mechanics, Electronics, Programming, Maths

Students

1-2

Collaboration: together with Chris Lehnert

 

Title: Low Cost Printable Arm for Robotic Manipulation

Description

There is now a range of projects (some open source, some less so) that are printing and building cheap(er) robot arms.

The plan for this project is to investigate potential designs and, over the course of the final year project, select and improve one of these designs for addition to a Pioneer robot. This may also involve working with the above project on developing a 3D printed robot hand.

Project Goals:

Skills

3D design & CAD skills, motivation for tinkering, high GPA, Mechanics, Electronics, Programming, Maths

Students

1

Supervisors: Steven Martin and Juxi Leitner


Chris Lehnert's Projects

Project details below are brief; contact me to discuss. If you email me, please be sure to put "BEB project" in the subject line so that I notice it.

Email: c.lehnert@qut.edu.au

1: Torque/Force control for applications such as robotic fruit harvesting

 

Title

Torque/Force control for applications such as robotic fruit harvesting

Description

This project aims to develop a torque control method that allows a robot arm to regulate the torques and forces it exerts on its environment. This includes the application of harvesting delicate fruits using torque/force control in order to harvest fruit without damage. The project will require the application of robot control theory and will include the opportunity to test on a real 6DoF robot arm designed for robotic fruit harvesting.
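As a toy illustration of the kind of control law involved (a single-joint impedance sketch with assumed gains and torque limits, not necessarily the method this project would use):

    import numpy as np

    # Single-joint impedance control sketch: command a torque so the joint behaves like a
    # virtual spring-damper about a reference angle, which limits the force on the fruit.
    K = 2.0        # virtual stiffness [Nm/rad] (assumed value)
    D = 0.5        # virtual damping [Nm s/rad] (assumed value)
    TAU_MAX = 1.0  # torque saturation, to avoid damaging delicate fruit (assumed value)

    def impedance_torque(q, qd, q_ref):
        """Commanded joint torque for compliant behaviour about q_ref."""
        tau = K * (q_ref - q) - D * qd
        return float(np.clip(tau, -TAU_MAX, TAU_MAX))

    # Example: the joint is 0.2 rad away from the reference and moving slowly.
    print(impedance_torque(q=0.0, qd=0.05, q_ref=0.2))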

Skills

Desired skills: Knowledge of robot control theory, C++ programming, and experience implementing efficient computational algorithms.

Students

1

2: Design of a Novel Omnidirectional Spherical Wheel

 

Title

Design of a Novel Omnidirectional Spherical Wheel

Description

We have developed a continuous isotropic spherical omnidirectional drive mechanism that is efficient in its mechanical simplicity and use of volume. Spherical omnidirectional mechanisms allow isotropic motion, although many are limited from achieving true isotropic motion by practical mechanical design considerations. A prototype platform was built using a combination of machining and 3D plastic printing and is illustrated in the images above. This project aims to develop the next prototype improving the performance and addressing some flaws in the design.

Skills

Desired skills: Skills in Electro-mechanical Design, Maths, Dynamics and Control Theory

Students

1

3: Plant classification for fruit harvesting

Title

Plant classification for fruit harvesting

Description

The aim of this project is to create a system for detecting and classifying key parts of plants for use in robotic fruit harvesting. For a robot to successfully harvest fruit it needs to understand what parts of the plant should be avoided, such as the main stem or branches of a plant. This work will look at developing a classification system that can identify the key parts of the plant for use on a robotic fruit harvester that has been developed at QUT. 
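For intuition only, a naive colour-thresholding baseline might separate candidate fruit from stems and leaves as below; the thresholds and file name are made up, and the project itself would develop a far more capable (likely learned) classifier:

    import cv2

    # Naive HSV colour thresholding: treat reddish pixels as candidate fruit and greenish
    # pixels as plant structure to avoid. Thresholds and file name are illustrative only.
    image = cv2.imread("plant.png")
    if image is None:
        raise FileNotFoundError("plant.png not found")

    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    fruit_mask = cv2.inRange(hsv, (0, 80, 60), (12, 255, 255))   # "red" hue band
    stem_mask = cv2.inRange(hsv, (30, 40, 40), (90, 255, 255))   # "green" hue band

    print("candidate fruit pixels:", int(fruit_mask.sum()) // 255,
          "stem/leaf pixels:", int(stem_mask.sum()) // 255)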

Skills

Desired skills: Robotic Vision, Machine Learning, C++ or Python.

Students

1

4: RoboRoo: Robot Kangaroo using Parallel Variable Elastic Actuators


 

Title

RoboRoo: Robot Kangaroo using Parallel Variable Elastic Actuators

Description

The aim of this project is to create a novel legged robot that uses similar mechanics to a kangaroo. Kangaroos have an impressive transport efficiency at high speeds (over 10 km/h), and this is due to the design of their legs. This project will look at replicating this mechanism and creating a legged robot that has a very low cost of transport.
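For background, the dimensionless cost of transport is usually computed as CoT = P / (m g v); the quick calculation below uses assumed numbers purely to illustrate the quantity the project would try to minimise:

    # Dimensionless cost of transport: CoT = P / (m * g * v). Lower is better.
    # The numbers below are illustrative assumptions, not measurements of any robot or kangaroo.
    m = 50.0      # body mass [kg]
    g = 9.81      # gravitational acceleration [m/s^2]
    v = 20 / 3.6  # forward speed: 20 km/h expressed in m/s
    P = 400.0     # power required to hop at that speed [W]

    cot = P / (m * g * v)
    print(f"cost of transport ~ {cot:.2f}")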

Skills

Desired skills: Robotic Vision, Machine Learning, C++ or Python.

Students

1

Collaboration: joint interest from Juxi Leitner

Michael Milford's Projects

Project details below are brief, contact specified person to discuss.

Title: Autonomous Cars, Big and Small
Supervisors: Professor Michael Milford

Description

Work on a range of autonomous car-related projects, including full-scale solar autonomous cars and miniature autonomous cars, covering areas such as closing the loop between perception and control, path planning, pedestrian and sign detection, and lane following.

Skills

Desired skills:

Significant experience and/or high GPA preferred, and some combination of coding, computer vision, machine learning or mechatronic hardware experience required.

Students

5

Contact: Michael Milford, michael.milford@qut.edu.au. Please send a CV listing your GPA, relevant experience, and coding and hardware skills.

Title: Camera-based Positioning Technology for Robots and Autonomous Vehicles
Supervisors: Professor Michael Milford

Description

Develop algorithms using computer vision and machine learning to accurately work out the position of a robot or vehicle relative to a reference visual map.
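A minimal sketch of the underlying idea, assuming whole-image matching on toy data (real systems use far more robust descriptors plus sequence and geometric information):

    import numpy as np

    def best_match(query, reference_images):
        """Index of the reference image most similar to the query.

        Uses sum of absolute differences on small grayscale thumbnails.
        """
        scores = [np.abs(query.astype(float) - ref.astype(float)).sum()
                  for ref in reference_images]
        return int(np.argmin(scores))

    # Toy example with random "images"; in practice these come from the camera and the map.
    rng = np.random.default_rng(0)
    reference = [rng.integers(0, 255, (32, 32)) for _ in range(10)]
    query = reference[3] + rng.integers(-5, 5, (32, 32))  # a noisy view of place 3
    print("matched place index:", best_match(query, reference))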

Skills

Desired skills:

Significant experience and/or high GPA preferred, and some combination of coding, computer vision, machine learning or mechatronic hardware experience required.

Students

1

Contact: Michael Milford, michael.milford@qut.edu.au. Please send a CV listing your GPA, relevant experience, and coding and hardware skills.

Title: Automated Storybook Writer
Supervisors: Professor Michael Milford

Description

Develop algorithms to automatically write simple storybook titles.

Skills

Desired skills:

Significant experience and/or high GPA preferred, and some combination of coding, computer vision, machine learning and education / tutoring / teaching experience required.

Students

1

Contact: Michael Milford, michael.milford@qut.edu.au

Frederic Maire's Projects (see the linked page for the list)

 

Thierry Peynot's Projects

1: Towards Robotic Knee Arthroscopy

Description

In this project you will contribute to the development of a robotic system to perform knee arthroscopy (minimally-invasive surgery). In the near term the system will assist a surgeon; in the long term it should be capable of performing the arthroscopy fully autonomously. The current system is composed of a robotic arm manipulating an arthroscope device (mini camera with optics at the end of a small stick, see picture above). A number of aspects need to be investigated, including (but not limited to):

  • Automatic adjustment of the arthroscope camera settings for best-quality viewing and/or optimal performance of vision algorithms
  • Experimental evaluation of the performance of different 3D reconstruction and SLAM algorithms, in the arthroscopy context
  • Accurate and reliable tracking of the arthroscope and the configuration of the knee using an OptiTrack system
  • (Constrained) Visual servoing of an arthroscope to follow a surgeon's tool
  • etc.

Skills

  • Some knowledge of Computer Vision and/or Robotics
  • Matlab and/or C/C++ programming

No. of Students

1-2

2: Reliable Scene Understanding in Mining Environments

Description

This project concerns the evaluation and the development of scene understanding techniques that can be appropriate, reliable and effective in mining environments. An example is the detection (and potentially tracking) of dynamic objects in the environments, such as people and vehicles. Computer vision is of particular interest, but techniques considered could also use data acquired by LIDARs, RADARs or other sensors.

Skills

  • Knowledge of Computer Vision and/or Signal Processing
  • Solid Matlab, Python or C/C++ programming
  • Knowledge of the Image Processing Toolbox (Matlab) or OpenCV would be a plus

No. of Students

up to 3

3: What can we learn from the differences of perception between sensor modalities?

Description

Combining different sensor modalities can often lead to more reliable perception, if the sensor data is interpreted appropriately. For example, a visual camera would be affected by the presence of smoke, while an infrared camera could allow the robot to see through it. However, these differences of perception are often neither fully understood nor exploited as such. This project will focus on these differences of perception and investigate what can be learnt from them.

Skills

  • Knowledge of Computer Vision and/or Signal Processing
  • Some knowledge of sensors such as Camera, LIDAR, RADAR
  • Solid Matlab, Python or C/C++ programming
  • Preferred: Good knowledge of machine learning algorithms
  • Preferred: Knowledge of the Image Processing Toolbox (Matlab) or OpenCV

No. of Students

1

 

4: Can RADAR perception be Vision with microwaves?

Description

This project will use deep learning techniques to train a perception system to be (almost) as good with RADAR as it can be with a visual camera.

Skills

  • Knowledge of Computer Vision and/or Signal Processing
  • Some knowledge of cameras and RADAR
  • Solid Matlab, Python or C/C++ programming
  • Solid knowledge of machine learning algorithms

No. of Students

1-2

5: Towards an Autonomous Astrobiologist Rover 

(Automatic Stromatolite Recognition)

Description

Astrobiologists look for signs of life on other planets, such as Mars. In particular, they hope to find stromatolites, i.e. rock structures that were formed by a biogenic process. The end goal of this project is to give a planetary rover the ability to help astrobiologists with this mission, by autonomously detecting stromatolites using computer vision. In this component of the project, the student(s) will investigate, develop and test algorithms that can detect and recognise characteristics associated with biogenicity (can you do it on the picture above?).

Skills

  • Computer Vision
  • Solid Matlab or C/C++ programming
  • Knowledge of the Image Processing Toolbox (Matlab) or OpenCV

No. of Students

1 or 2
