3D Trajectory Synthesis and Control for a Legged Swimming Robot


We are interested in making underwater robots agile and intuitively controllable by human divers, so that a diver can interact with their robot to execute complex underwater exploration missions as safely and efficiently as possible. Consider cave diving or shipwreck exploration: both are risky for divers without extensive and rigorous training. Small, agile robots that can maneuver in 3D can execute a diver's commands accurately, as long as those commands are communicated clearly. This project is a step in that direction.

3D Autopilot, Agility, and Range of Motions

Here the Aqua2 robot executes a range of pre-scripted choreographies that illustrate its agility, its quick response times, and the capabilities of the 3D autopilot, which enables the robot to maintain any orientation in space. Tuning of the controller parameters that enable these motions was done autonomously underwater: an optimization procedure modifies the parameter set, executes the motion for some time, and evaluates the resulting error. The parameter set with the lowest error is saved for later experiments.
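
The sketch below illustrates one way such an underwater tuning loop could be structured, here as a simple random search over controller gains. The functions execute_motion() and sample_parameters(), the gain names, and the random-search strategy are illustrative assumptions, not the procedure actually used on the robot.

    import random

    def sample_parameters(base, scale=0.2):
        """Perturb each controller gain by a small random fraction."""
        return {k: v * (1.0 + random.uniform(-scale, scale)) for k, v in base.items()}

    def tune_controller(initial_params, execute_motion, iterations=30):
        """Repeatedly perturb the gains, run the motion, and keep the best set."""
        best_params = dict(initial_params)
        best_error = execute_motion(best_params)   # e.g., RMS orientation error over the run
        for _ in range(iterations):
            candidate = sample_parameters(best_params)
            error = execute_motion(candidate)      # run the choreography underwater and score it
            if error < best_error:                 # keep only the lowest-error parameter set
                best_params, best_error = candidate, error
        return best_params, best_error

Called as, say, tune_controller({"kp": 1.0, "kd": 0.1}, execute_motion), the loop returns the best gains found within the trial budget, mirroring the modify-execute-evaluate cycle described above.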

Trajectory Specification via Augmented Reality

The human operator can easily "draw" a desired 3D trajectory by moving a fiducial tag through space. These trajectories can be turned into control commands for the robot to execute, provided that the robot can adjust them to account for noise and the presence of obstacles. When operating on land, the operator gets visual feedback of the trajectory as it is drawn. Exactly the same interaction scheme can be used underwater, where a diver moves the tags in 3D space to indicate a desired trajectory; the difference is that visual feedback of what is being drawn becomes harder to provide.
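
As a rough sketch of how such a drawn trajectory could be captured, the code below records the tag's detected 3D position at a fixed rate and smooths it with a moving average before handing it to the controller. The detect_tag_pose() function is a hypothetical stand-in for a fiducial detector, and the moving-average filter is only one simple way to handle detection noise.

    import time
    import numpy as np

    def record_trajectory(detect_tag_pose, duration_s=10.0, rate_hz=10.0):
        """Sample the tag's 3D position while the operator 'draws' in space."""
        waypoints = []
        t_end = time.time() + duration_s
        while time.time() < t_end:
            pose = detect_tag_pose()          # returns (x, y, z) or None if the tag is not visible
            if pose is not None:
                waypoints.append(pose)
            time.sleep(1.0 / rate_hz)
        return np.array(waypoints)

    def smooth_trajectory(waypoints, window=5):
        """Moving-average filter to suppress detection noise before execution."""
        if len(waypoints) < window:
            return waypoints
        kernel = np.ones(window) / window
        return np.column_stack(
            [np.convolve(waypoints[:, i], kernel, mode="valid") for i in range(3)]
        )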

Terrain-based Programming

Fiducial tags are not only useful to the human operator for drawing trajectories; they can also be useful to the robot. Tags can encode commands, behaviors, and warnings, much like traffic signs. When the robot sees them, it can incorporate the suggested behavior, as well as metric information about the tag's location in the environment, into its trajectory. In the experiment depicted above, the robot traverses a section of a shipwreck. Tags placed on the sea bottom and on the ship indicate, for instance, "sharp vertical turn ahead," so that the robot avoids crashing into the ship, and "U-turn ahead," so that the robot returns to the diver.
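
The following sketch shows how tag IDs might be mapped to behaviors and folded into the robot's trajectory. The tag IDs, the behavior names, and the planner interface (insert_pitch_maneuver, insert_yaw_maneuver) are all hypothetical; they only illustrate the traffic-sign idea described above.

    # Hypothetical mapping from tag IDs to encoded behaviors.
    TAG_BEHAVIORS = {
        17: "sharp_vertical_turn",   # placed on the sea bottom before the ship's hull
        23: "u_turn",                # placed where the robot should head back to the diver
    }

    def on_tag_detected(tag_id, tag_position, planner):
        """Fold a tag's command and its location into the current trajectory."""
        behavior = TAG_BEHAVIORS.get(tag_id)
        if behavior == "sharp_vertical_turn":
            # Climb steeply shortly before reaching the tag to clear the obstacle.
            planner.insert_pitch_maneuver(at=tag_position, pitch_deg=60)
        elif behavior == "u_turn":
            # Reverse heading so the robot returns toward the diver.
            planner.insert_yaw_maneuver(at=tag_position, yaw_deg=180)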