Depending on the operational environment of an autonomous system, an object detection model may face substantial perceptual uncertainty. For instance, changes in environmental lighting, reflections, lens flares, and other factors may require a system to be highly robust in order to avoid failure. Unfortunately, many machine learning techniques require significant resources to achieve such robustness, and this may not be feasible for some engineering teams.
Fiducial-marker detection (e.g. using ArUco markers) offers a means of obtaining reliable performance without the need for machine learning techniques. However, in more complex and dynamic environments this approach may not be sufficient on its own.
For this project, a joint marker detection, state estimation, and control algorithm was developed for autonomous UAVs (AUAVs).
* Marker Detection: The algorithm first detects ArUco markers in the incoming visual data stream.
* State Estimation: The relative transform of the AUAV with respect to the marker is calculated.
* Control: The output of the state estimation (the relative transform) is used to generate a trajectory that aligns the UAV with the center of the marker and then performs a gate maneuver.
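The state-estimation and control steps above can be sketched in plain Python. This is a minimal illustration, not the project's actual implementation: the frame conventions, function names, and the waypoint scheme (standoff distance, step spacing, pass-through distance) are all assumptions, and in practice the marker pose would come from an ArUco detector and a PnP solver such as OpenCV's.

```python
def invert_pose(R, t):
    """Invert a rigid transform.

    Given the marker's rotation R (3x3, nested lists) and translation t
    (length 3) in the camera frame, return the camera's pose in the
    marker frame: (R^T, -R^T t).
    """
    # Transpose of a rotation matrix is its inverse.
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]
    # Translation of the inverse transform: -R^T t.
    t_inv = [-sum(Rt[i][k] * t[k] for k in range(3)) for i in range(3)]
    return Rt, t_inv


def gate_trajectory(t_marker, standoff=1.0, pass_through=0.5, step=0.25):
    """Generate waypoints (relative to the vehicle, camera convention:
    x right, y down, z forward) that first align the vehicle with the
    marker center at a standoff distance, then fly through the gate.

    t_marker is the marker's translation relative to the vehicle; the
    parameter values are illustrative defaults, in meters.
    """
    x, y, z = t_marker
    # 1. Alignment: null the lateral offsets while holding a standoff.
    waypoints = [(x, y, z - standoff)]
    # 2. Gate maneuver: advance along the marker normal, past its plane.
    d = z - standoff
    end = z + pass_through
    while d < end:
        d = min(d + step, end)
        waypoints.append((x, y, d))
    return waypoints
```

In this sketch the alignment and pass-through phases are expressed as straight-line waypoints; a real controller would also regulate yaw from the marker's rotation and re-plan as new detections arrive.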