I recently had the opportunity to join the Duke Shell Ocean Discovery X-Prize team. Teams compete for a $7 million prize and an additional $1 million bonus prize. The goal of the competition is to successfully map 50 square miles of the ocean floor and produce (1) a high-resolution bathymetric map, (2) images of a specified object, and (3) an identification of archaeological, biological, or geological features.
The team has made it through the first round of cuts, narrowing the field from the original 25 teams down to 10. The next rounds of testing are scheduled for this fall, with a full test of the system in mid-October.
Our approach to quickly mapping the ocean floor is this:
1. A drone flies out over the ocean, dropping off buoys in different sections of the planned mapping area.
2. The buoys map the ocean floor using sonar.
3. The drone picks up the buoys and returns them to base.
To navigate to the buoy autonomously, I have developed a multi-step approach-and-retrieval system. The first step is GPS navigation, which should get the drone within a 30-foot radius of the buoy. Next, I use a You-Only-Look-Once (YOLO) based object detection model to find the buoy on the water in real time. While there are computationally less intensive approaches that could be used, YOLO gives the most accurate results without having to separately account for water glare, object occlusion, dropped frames, or the angle of the buoy on approach. Solving these issues was critical to developing a system that can make multiple passes at picking up the buoy while being robust enough to handle a moving target on potentially choppy ocean waves.
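At this stage all the downstream logic needs from YOLO is the buoy's image-plane position. A minimal sketch of that wrapper is below; the detection tuple format, the `"buoy"` class name, and the confidence threshold are illustrative assumptions, not our actual pipeline:

```python
from typing import List, Optional, Tuple

# Assumed detection format: (class_name, confidence, (x_min, y_min, x_max, y_max)) in pixels.
Detection = Tuple[str, float, Tuple[float, float, float, float]]

def best_buoy_center(detections: List[Detection],
                     min_conf: float = 0.5) -> Optional[Tuple[float, float]]:
    """Return the pixel center of the most confident 'buoy' detection,
    or None when nothing clears the threshold (e.g. a dropped frame
    or glare washing out the target)."""
    buoys = [d for d in detections if d[0] == "buoy" and d[1] >= min_conf]
    if not buoys:
        return None
    _, _, (x0, y0, x1, y1) = max(buoys, key=lambda d: d[1])
    return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)
```

Returning `None` rather than a stale position lets the caller decide whether to hold course or break off for another pass.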
Once the buoy is within close range (< 4 feet), the system switches to fiducial tracking to locate the buoy with exact X-Y-Z coordinates. Fiducials are placed around the top of the buoy and are visible to the drone's camera as it approaches for pickup. Because of the accuracy of the fiducial system, this gives us a fine-grained final approach to buoy acquisition.
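One way to see where the fiducial's X-Y-Z estimate comes from is the pinhole camera model: a tag of known physical size appears smaller the farther away it is, so its apparent size in pixels yields range, and its pixel offset from the image center yields lateral position. The sketch below shows that geometry only; the focal length and tag size are placeholder values, and a real fiducial library would solve the full tag pose instead:

```python
def tag_range(focal_px: float, tag_side_m: float, tag_side_px: float) -> float:
    """Pinhole-model range estimate: distance = focal_length * real_size / apparent_size."""
    return focal_px * tag_side_m / tag_side_px

def tag_xyz(cx_px: float, cy_px: float, img_w: int, img_h: int,
            focal_px: float, tag_side_m: float, tag_side_px: float):
    """Rough camera-frame (X, Y, Z) of a fiducial from its pixel center and size.
    X is right of the optical axis, Y is down, Z is range along the axis."""
    z = tag_range(focal_px, tag_side_m, tag_side_px)
    x = (cx_px - img_w / 2.0) * z / focal_px
    y = (cy_px - img_h / 2.0) * z / focal_px
    return x, y, z
```

This is why the distance metric is only "rough": it degrades when the tag is viewed at an angle, which a full pose solver handles better.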
Initial YOLO buoy tracking test:
Initial fiducial tracking test:
I've developed both systems to run on the Jetson TX2 development board. This allows the YOLO system to run on the GPU, keeping inference time low and the frame rate usable: inference for the YOLO system on a TX2 is around 8 ms, with a frame rate of around 18 FPS. Both systems return the XY coordinates of the object, and the fiducial system returns an additional (rough) distance estimate. These positions are passed to the flight-director system, which sends PWM signals to the motors. The coordination between these systems needed to be fast, robust, and lightweight, so I implemented the hand-off between each system, along with the directional movement of the motors and steering of the drone, using ZeroMQ. ZeroMQ is a lightweight messaging library with built-in pub/sub patterns. While there are more established options like the Robot Operating System (ROS), they were ultimately more than this project called for and would have added an unnecessary layer of abstraction.
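At its simplest, the flight director's job reduces to turning the reported XY offset into PWM pulse widths. A minimal proportional sketch is below; the neutral pulse width, gain, and clamp limits are placeholder values I've chosen for illustration, and a real controller would add damping and altitude handling:

```python
def xy_error_to_pwm(cx: float, cy: float, img_w: int, img_h: int,
                    neutral_us: float = 1500.0, gain: float = 0.5,
                    limit: float = 200.0):
    """Map the target's pixel offset from image center to roll/pitch PWM
    pulse widths (microseconds), clamped around a neutral stick position.
    All constants here are illustrative, not tuned flight values."""
    err_x = cx - img_w / 2.0   # positive: target is right of center
    err_y = cy - img_h / 2.0   # positive: target is below center
    clamp = lambda v: max(-limit, min(limit, v))
    roll_us = neutral_us + clamp(gain * err_x)
    pitch_us = neutral_us + clamp(gain * err_y)
    return roll_us, pitch_us
```

In our setup these commands travel over a ZeroMQ pub/sub socket rather than a direct function call, so the vision processes and the flight director stay decoupled and any one of them can restart without taking down the others.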
The development of this system is still in progress, but the system we've implemented so far is relatively straightforward.