Teaching

I typically teach robotics at both the undergraduate and graduate levels.

In 2017-18, I taught CMPUT 412 in the Winter Term (January 2018); the year before, I taught CMPUT 631 (Winter 2017). Both courses deal with algorithms for autonomous robot navigation, with an emphasis on computer vision. CMPUT 412 focuses on experimentation with physical robots, whereas CMPUT 631 studies the theoretical basis of the robot navigation algorithms. We use ROS (the Robot Operating System) in both courses and program the robots in either Python or C++. For an overview of CMPUT 412, see the following two videos made about the course in the last two years.
[Video: CMPUT 412 course overview, Winter 2017]

[Video: CMPUT 412 course overview, Winter 2018]
CMPUT 412 - Experimental Mobile Robotics (Winter 2018)

Overview

This course is a hands-on, experimental investigation of the fascinating field of mobile robotics. Students will be exposed to basic concepts, models, and algorithms in the autonomous navigation of a mobile robot operating in indoor environments. The course is taught using the Robot Operating System (ROS), which provides libraries and tools that help software developers quickly create robot applications. Students will be provided with a mobile robot for conducting real experiments; it is equipped with a range sensor and a camera so that the robot can perceive its environment and make navigational decisions. The course develops incrementally through weekly demos and competitions. The robot tasks that define the demos and competitions begin with simple motion control and evolve toward a comprehensive term project involving robot mapping, localization, path planning, object detection, and homing.
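To give a flavour of what this looks like in practice, here is a minimal sketch of a ROS node for the kind of simple motion control the first demos start from. It assumes a typical setup in which the robot base accepts velocity commands on a /cmd_vel topic; the topic name, speed, and rate are illustrative, not the exact course configuration.

    #!/usr/bin/env python
    # Minimal sketch: drive the robot forward slowly by publishing velocity
    # commands. Assumes the base subscribes to /cmd_vel (name is illustrative).
    import rospy
    from geometry_msgs.msg import Twist

    def main():
        rospy.init_node('simple_motion')
        pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        rate = rospy.Rate(10)          # publish at 10 Hz
        cmd = Twist()
        cmd.linear.x = 0.2             # forward speed in m/s
        cmd.angular.z = 0.0            # no rotation
        while not rospy.is_shutdown():
            pub.publish(cmd)
            rate.sleep()

    if __name__ == '__main__':
        main()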

Objective

The objective of the course is for students to gain an understanding of the basics of programming a mobile robot, so that they can develop software for a physical mobile robot using the popular Robot Operating System (ROS). Students will learn how to model a robot physically and how to use sensor data so that the robot can model its environment and navigate it efficiently and without collisions. Students will also be exposed to general issues and concepts in robotics.

Course Topics

  • Introduction to robotics
  • Mobile robot kinematics
  • Robot Operating System (ROS)
  • Lidar and RGB-D camera
  • Motion control of a mobile robot
  • Simultaneous localization and mapping (SLAM)
  • Path planning
  • Visual docking
  • Visual teach and repeat

Course Work and Evaluation

There are no homework assignments or exams. Performance evaluation will be based on the following three components.

  1. Seven demonstrations at 5% each, for a total of 35%
  2. Four competitions, 3 at 10% and 1 at 20%, for a total of 50%
  3. Competition reports, 3 at 3% each and 1 at 6%, for a total of 15%

Students work in groups of two. Demonstrations require each group to complete a robot task using solutions that are discussed in the lectures, the textbook, or online resources. Competitions are extensions and integrations of the functionality achieved in the demos, and student groups are ranked and given marks based on their placement relative to the other groups. Each group is required to maintain a Google Sites webpage that reports its competition solutions. The schedule of the demos and competitions is available in the calendar section of the course eClass page.

The two members of a group are expected to contribute equally to the completion of the above course components in order for both to receive the same marks. Exchange of information among groups is encouraged, subject to the Code of Student Behaviour as described in the academic integrity section of the Undergraduate Handbook.

Textbook

Programming Robots with ROS: A Practical Introduction to the Robot Operating System, by Morgan Quigley, Brian Gerkey, and William D. Smart, First Edition, 2015, O'Reilly.

CMPUT 631 - Autonomous Robot Navigation (Winter 2017)

Overview

This course is concerned with the subject of autonomous robot navigation. Students will become familiar with related mobile robotics research and study a number of classical and modern algorithms. Specifically, the course will focus on how a mobile robot builds a map and localizes itself in that map at the same time (the so-called SLAM problem) by making use of the information collected by its sensors, such as laser range finders and cameras. The lectures will introduce both basic and advanced SLAM algorithms, and students will gain an in-depth understanding of these algorithms by reading research papers and examining their software implementations. Class lectures and homework assignments will rely on the Robot Operating System (ROS), which provides libraries and tools to help software developers quickly create robot applications, to control robots in simulated environments and to study SLAM algorithms on benchmark datasets.
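As a rough illustration of the level at which these algorithms are treated, the sketch below implements a single particle-filter (Monte Carlo localization) step in Python with NumPy. The motion and measurement models are toy stand-ins chosen for brevity; the code is illustrative only, not material from the assignments.

    # One particle-filter step: predict with a noisy motion model, reweight by a
    # toy range measurement to a known landmark, then resample.
    import numpy as np

    def pf_step(particles, weights, control, z, motion_noise, meas_noise, landmark):
        # particles: (N, 3) array of [x, y, theta] pose hypotheses
        N = len(particles)
        v, w = control                                   # forward / angular motion this step

        # 1. Prediction: move every particle with added noise.
        particles[:, 2] += w + np.random.randn(N) * motion_noise
        particles[:, 0] += (v + np.random.randn(N) * motion_noise) * np.cos(particles[:, 2])
        particles[:, 1] += (v + np.random.randn(N) * motion_noise) * np.sin(particles[:, 2])

        # 2. Correction: weight each particle by how well it explains the
        #    measured range z to the landmark.
        expected = np.hypot(particles[:, 0] - landmark[0], particles[:, 1] - landmark[1])
        weights *= np.exp(-0.5 * ((z - expected) / meas_noise) ** 2)
        weights += 1e-300                                # guard against all-zero weights
        weights /= weights.sum()

        # 3. Resampling: draw particles in proportion to their weights.
        idx = np.random.choice(N, size=N, p=weights)
        return particles[idx], np.full(N, 1.0 / N)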

Course Topics

  • Introduction to robotics
  • Robot Operating System (ROS)
  • Mobile robot kinematics
  • Sensors: lidars and cameras
  • Extended Kalman filter and its application to robot localization
  • Particle filter and its application to simultaneous localization and mapping (SLAM)
  • Graph SLAM and nonlinear optimization
  • Place recognition and loop closure detection
  • Visual odometry and visual SLAM (illustrated by the sketch below)
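For the visual topics above, the following small sketch shows the kind of binary-feature matching (ORB keypoints and descriptors in OpenCV, building on the FAST and BRIEF papers in the readings) that underlies visual odometry front ends and appearance-based place recognition. The image file names are placeholders, and this is only a sketch of the idea, not course code.

    # Match ORB features between two frames with a brute-force Hamming matcher.
    import cv2

    img1 = cv2.imread('frame_a.png', cv2.IMREAD_GRAYSCALE)   # placeholder file names
    img2 = cv2.imread('frame_b.png', cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=1000)                      # FAST corners + binary descriptors
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # Keep only mutual best matches, sorted by descriptor distance.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    print('%d tentative correspondences' % len(matches))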

Prerequisites

Graduate student status in Computing Science or consent of the instructor; a personal Linux computer running Ubuntu 14.04; and familiarity with installing and developing software (in either Python or C++) on Ubuntu.

Text (Recommended)

Quigley, Morgan, Brian Gerkey, and William D. Smart. Programming Robots with ROS: A Practical Introduction to the Robot Operating System, O'Reilly Media, Inc., 2015.

Readings

  1. Durrant-Whyte, Hugh, and Tim Bailey. "Simultaneous Localisation and Mapping (SLAM): Part I." IEEE Robotics and Automation Magazine, June 2006.
  2. Cadena, Cesar, et al. "Past, Present, and Future of Simultaneous Localization and Mapping: Towards the Robust-Perception Age." IEEE Transactions on Robotics 32.6 (2016): 1309-1332.
  3. Thrun, Sebastian. "Particle Filters in Robotics." Proceedings of the Eighteenth Conference on Uncertainty in Artificial Intelligence. Morgan Kaufmann Publishers Inc., 2002.
  4. Fox, Dieter, et al. "Monte Carlo Localization: Efficient Position Estimation for Mobile Robots." AAAI/IAAI (1999): 343-349.
  5. Grisetti, Giorgio, Cyrill Stachniss, and Wolfram Burgard. "Improved Techniques for Grid Mapping with Rao-Blackwellized Particle Filters." IEEE Transactions on Robotics 23.1 (2007): 34-46.
  6. Grisetti, G., et al. "g2o: A General Framework for Graph Optimization." IEEE International Conference on Robotics and Automation, 2011.
  7. Grisetti, G., R. Kummerle, C. Stachniss, and W. Burgard. "A Tutorial on Graph-Based SLAM." IEEE Intelligent Transportation Systems Magazine 2.4 (2010): 31-43.
  8. Rosten, Edward, and Tom Drummond. "Machine Learning for High-Speed Corner Detection." European Conference on Computer Vision. Springer Berlin Heidelberg, 2006.
  9. Calonder, Michael, et al. "BRIEF: Binary Robust Independent Elementary Features." European Conference on Computer Vision. Springer Berlin Heidelberg, 2010.
  10. Sivic, J., and A. Zisserman. "Video Google: A Text Retrieval Approach to Object Matching in Videos." Proceedings of the International Conference on Computer Vision (ICCV), 2003.
  11. Galvez-Lopez, D., and J. D. Tardos. "Bags of Binary Words for Fast Place Recognition in Image Sequences." IEEE Transactions on Robotics 28.5 (2012): 1188-1197.
  12. Fraundorfer, F., and D. Scaramuzza. "Visual Odometry: Part II: Matching, Robustness, Optimization, and Applications." IEEE Robotics & Automation Magazine 19.2 (2012): 78-90.
  13. Mur-Artal, Raul, J. M. M. Montiel, and Juan D. Tardos. "ORB-SLAM: A Versatile and Accurate Monocular SLAM System." IEEE Transactions on Robotics 31.5 (2015): 1147-1163.
  14. Lowry, Stephanie, et al. "Visual Place Recognition: A Survey." IEEE Transactions on Robotics 32.1 (2016): 1-19.

References

  1. ROS Indigo, http://wiki.ros.org/indigo.
  2. Fallon, Maurice, et al. "The MIT Stata Center Dataset." The International Journal of Robotics Research 32.14 (2013): 1695-1699.
  3. Grisetti, Giorgio. "An Incomplete Scan Matching Tutorial." Unpublished and undated.
  4. ROS Cheat Sheet (Indigo v2.0), Clearpath Robotics, 2015.

Evaluation

Student evaluation is based on five assignments (45%), one midterm (25%), and the course project (30%).
