A Visual Servoing Based Path Following Controller

Cole Dewis - dewis@ualberta.ca

Demos

Below are demo videos of the algorithm applied to a sanding task and to a de-mining task on a mobile robot base:

Paper

To see the derivations, controller equations, and performance analysis, the full paper can be accessed here: Paper Link

Abstract

Visual servoing has been well explored in the literature for task specification and planning in image space. Planning tasks and paths in image space can be especially useful in unstructured environments, as a 3D reconstruction is not needed. However, few works have discussed following arbitrary image paths with visual servoing for robotic arms. This paper presents a path following controller for robotic arms, based on image-based visual servoing, that can follow arbitrary paths in image space. The controller uses visual error to generate velocities that smoothly approach the path along the tangent. Additionally, the controller can optionally follow the orientation of the path, and can be applied to both eye-in-hand and eye-to-hand setups. Experiments are conducted on a Kinova Gen3 7DOF arm to evaluate the controller. Benefits of the path following controller over a trajectory-tracking approach are shown. Specifically, our path following controller displays smooth responses to physical disturbances and forced pauses.

Methods

We develop a visual servoing-based path following controller that allows robot arms to follow arbitrary paths or contours, so that tasks can be specified as paths in image space. This is desirable because it avoids the need for expensive 3D reconstructions of the environment. The controller applies to both eye-in-hand and eye-to-hand camera configurations, and thus transfers easily between setups. By developing a path following controller (PFC) instead of a trajectory tracking controller (TTC), the resulting motion is more robust to delays and disturbances: a PFC selects its target at each time step as the path state closest to the current state, whereas a TTC follows a timed reference, so a disturbance can cause poor behavior when the robot falls behind that reference.
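The closest-state target selection that distinguishes a PFC from a TTC can be sketched as follows. This is a minimal illustration, not the paper's implementation; the discretized path, the function name, and the nearest-neighbor lookup are assumptions for the sketch.

```python
import numpy as np

def closest_path_index(path, s):
    """Hypothetical PFC target selection: index of the path point
    nearest to the current image-space state.

    path: (N, 2) array of image-space waypoints
    s:    (2,) current feature position in the image
    """
    # Unlike a timed reference, the target depends only on where we
    # are now, so a delay or disturbance simply re-selects the target.
    distances = np.linalg.norm(path - s, axis=1)
    return int(np.argmin(distances))
```

Because the target is recomputed from the current state at every iteration, a robot that is paused or pushed off the path resumes from the nearest point rather than chasing a reference that has moved on.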

As the controller is based on visual servoing, the control law takes the current image position of the robot and the target path point and uses the resulting error to generate velocities. These velocities are weighted to bring the robot both towards the path and along the path, with the error magnitude specifying the weighting between the two terms. Under the PFC framework, the target point at each iteration is the state on the path closest to the current state rather than a timed reference; combined with the two control terms, this encourages a smooth, tangent approach to the path, both on the initial approach and when re-approaching after a disturbance.
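The two-term control law described above can be sketched in a few lines. This is an illustrative sketch under stated assumptions, not the paper's controller: the exponential error-magnitude weighting, the gain `k`, the speed cap `v_max`, and the forward-difference tangent are all hypothetical choices standing in for the weighting derived in the paper.

```python
import numpy as np

def pfc_velocity(path, s, v_max=1.0, k=1.0):
    """Sketch of an image-space PFC velocity command.

    Blends an approach-to-path term (the visual error toward the
    closest path point) with an along-path term (the local tangent),
    weighted by the error magnitude: far from the path the approach
    term dominates; near the path the tangent term dominates.
    """
    distances = np.linalg.norm(path - s, axis=1)
    i = int(np.argmin(distances))          # closest-state target (PFC)
    e = path[i] - s                        # visual error toward the path
    j = min(i + 1, len(path) - 1)
    t = path[j] - path[i]                  # forward-difference tangent
    t = t / (np.linalg.norm(t) + 1e-9)
    # Hypothetical weighting: w -> 1 far from the path, w -> 0 on it.
    w = 1.0 - np.exp(-k * np.linalg.norm(e))
    v = w * e + (1.0 - w) * t
    n = np.linalg.norm(v)
    return v_max * v / n if n > 1e-9 else np.zeros_like(v)
```

On the path the command points along the tangent; far from it the command points toward the closest path point, and the blend between the two produces the smooth tangential approach and re-approach behavior.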