Available in 2024

Course code: MCHA4400
Units: 10 units
Level: 4000 level

Course handbook

Description

Vision-based sensors such as lidar, radar, and optical & thermal cameras are having an enormous impact on the navigation of autonomous vehicles and mobile robots. Navigation relies on accurate sensor models in combination with a 3D representation of the environment and a dynamic model of the vehicle and obstacles. Students who complete this course will acquire the background geometric tools and kinematics for developing sensor likelihood models, which are used within a Bayesian data fusion framework. This framework delivers estimates of the vehicle pose and pose-rate and provides a map of the environment; topics include simultaneous localisation and mapping (SLAM) and optic-flow egomotion (visual odometry).
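To make the Bayesian data fusion step concrete, here is a minimal sketch (not drawn from the course materials; the Gaussian likelihood, grid, and all numbers are illustrative assumptions) of a single measurement update, p(x|y) ∝ p(y|x) p(x), over a one-dimensional pose grid in Python:

    # Minimal sketch: one Bayesian measurement update on a 1D pose grid.
    # All numbers (grid, prior width, noise level) are illustrative assumptions.
    import numpy as np

    x = np.linspace(-5.0, 5.0, 501)            # candidate vehicle positions
    prior = np.exp(-0.5 * (x / 2.0) ** 2)      # broad Gaussian prior over pose
    prior /= prior.sum()

    y, sigma = 1.3, 0.5                        # measurement and assumed noise std. dev.
    likelihood = np.exp(-0.5 * ((y - x) / sigma) ** 2)   # sensor likelihood p(y|x)

    posterior = likelihood * prior             # Bayes' rule: p(x|y) ∝ p(y|x) p(x)
    posterior /= posterior.sum()               # normalise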


Availability

Callaghan

  • Semester 2 - 2024

Learning outcomes

On successful completion of the course students will be able to:

1. Define appropriate reference frames and coordinate systems to express world-fixed and body-fixed objects and their motion (a brief sketch follows this list)

2. Calibrate sensor likelihood functions based on calibration data by posing and solving a constrained optimisation problem

3. Design a landmark-based SLAM solution

4. Implement and validate a real-time landmark-based SLAM solution

5. Compute optic flow on the view sphere based on a sequence of images

6. Design an optic-flow based navigation solution

7. Implement and validate a real-time optic-flow based navigation solution
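
As a brief sketch of outcome 1 (illustrative only; the planar example, the frame names {n} and {b}, and all numbers are assumptions rather than course material), a point expressed in body-fixed coordinates can be mapped to world-fixed coordinates with a rotation and a translation:

    # Minimal sketch: expressing a body-fixed point in world-fixed coordinates.
    import numpy as np

    def Rz(psi):
        # Rotation about the z-axis by heading angle psi (radians).
        c, s = np.cos(psi), np.sin(psi)
        return np.array([[c, -s, 0.0],
                         [s,  c, 0.0],
                         [0.0, 0.0, 1.0]])

    psi = np.deg2rad(30.0)             # assumed vehicle heading
    r_n = np.array([2.0, 1.0, 0.0])    # position of body frame {b} origin in world frame {n}

    p_b = np.array([1.0, 0.0, 0.0])    # a point on the vehicle, expressed in {b}
    p_n = Rz(psi) @ p_b + r_n          # the same point expressed in {n}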


Content

Fundamentals of visual sensors

  • Revision of kinematics
  • Geometry of vision
  • Camera models (parametric and non-parametric)
  • Planar and spherical projections (see the sketch after this list)
  • Sensor calibration
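
As a sketch of planar versus spherical projection (illustrative only; the pinhole intrinsics below are assumed numbers, not calibration data from the course):

    # Minimal sketch: planar (pinhole) and spherical projections of a 3D point.
    import numpy as np

    K = np.array([[800.0,   0.0, 320.0],    # assumed pinhole intrinsics:
                  [  0.0, 800.0, 240.0],    # focal lengths and principal point
                  [  0.0,   0.0,   1.0]])

    P_c = np.array([0.5, -0.2, 4.0])        # a point in camera coordinates (z forward)

    uvw = K @ P_c                           # planar projection onto the image plane
    pixel = uvw[:2] / uvw[2]                # pixel coordinates (u, v)

    eta = P_c / np.linalg.norm(P_c)         # spherical projection: bearing on the unit view sphere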

Vision as pose sensor

  • Feature identification / extraction
  • Landmark management
  • Data association
  • Bundle adjustment
  • Review of nonlinear Bayesian filtering (an illustrative update step follows this list)
  • Sparse extended information filter SLAM (SEIF-SLAM)
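
The following is a minimal sketch of a single extended Kalman filter measurement update, a simpler relative of the information-filter formulation (SEIF) named above; the function and all numbers are illustrative assumptions, not the course's implementation:

    # Minimal sketch: generic EKF measurement update.
    import numpy as np

    def ekf_update(x, P, y, h, H, R):
        # x, P : prior state mean and covariance
        # y    : measurement; h(x) measurement model; H its Jacobian at x; R noise covariance
        v = y - h(x)                          # innovation
        S = H @ P @ H.T + R                   # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
        x_new = x + K @ v
        P_new = (np.eye(len(x)) - K @ H) @ P
        return x_new, P_new

    # Illustrative use: a scalar position observed directly.
    x0, P0 = np.array([0.0]), np.array([[1.0]])
    h = lambda x: x                           # identity measurement model
    H, R = np.array([[1.0]]), np.array([[0.25]])
    x1, P1 = ekf_update(x0, P0, np.array([0.8]), h, H, R)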

Vision as pose-rate sensor

  • Image flow (sparse and dense estimators); a sparse-flow sketch follows this list
  • Optic flow on the view sphere
  • Egomotion
  • Flow-based navigation
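
As a sketch of sparse image flow (assuming OpenCV's Python bindings, cv2, are available; the synthetic shifted frame below stands in for a real camera sequence):

    # Minimal sketch: sparse Lucas-Kanade image flow between two frames.
    import numpy as np
    import cv2

    rng = np.random.default_rng(0)
    prev = (rng.random((240, 320)) * 255).astype(np.uint8)
    prev = cv2.GaussianBlur(prev, (7, 7), 2.0)       # smooth so corners track reliably
    nxt = np.roll(prev, shift=(2, 3), axis=(0, 1))   # pretend the camera shifted 3 px right, 2 px down

    p0 = cv2.goodFeaturesToTrack(prev, maxCorners=50, qualityLevel=0.01, minDistance=8)
    p1, status, err = cv2.calcOpticalFlowPyrLK(prev, nxt, p0, None)

    flow = (p1 - p0)[status.ravel() == 1]            # per-feature flow vectors, in pixels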

Assumed knowledge

MCHA4100 Mechatronics Systems, ENGG3300 Machine Learning for Engineers


Assessment items

Tutorial / Laboratory Exercises: Laboratory Exercise (x6)

Written Assignment: Visual SLAM

Written Assignment: Optic Flow Integration


Contact hours

Semester 2 - 2024 - Callaghan

Laboratory-1
  • Face to Face On Campus 2 hours per week for 13 weeks starting in week 1
Laboratory-2
  • Face to Face On Campus 2 hours per week for 13 weeks starting in week 1
Lecture-1
  • Face to Face On Campus 2 hours per week for 13 weeks starting in week 1

Course outline

Course outline not yet available.