Precision autonomous approach and landing using a novel vision and imitation learning method in GPS-denied environments
The objective of this project is to develop a novel control methodology for autonomous approach and landing, inspired by the procedure Navy pilots follow when landing on space-constrained ship decks using the “horizon bar,” a standardized visual reference. The idea is to use machine vision to track a portable standardized visual cue, similar to the “horizon bar,” which a soldier can easily carry and install anywhere, even on a moving vehicle or a ship. The relative position and heading information from the vision system is then combined with imitation-learning-based control algorithms to bring an experienced pilot’s skill and decision-making process to a complex maneuver such as landing on a small ship deck or in dense urban terrain. This method has been successfully implemented on a quad-rotor and has demonstrated autonomous tracking and landing on a moving platform.
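To make the pipeline concrete, the following is a minimal sketch, not the project's actual implementation. It assumes the vision module already provides the relative position and heading to the visual cue, and it trains a small neural-network policy by behavioral cloning (one common form of imitation learning) on recorded pilot demonstrations. All names, dimensions, network sizes, and the synthetic data are illustrative assumptions.

```python
# Hypothetical sketch: behavioral cloning of a landing controller.
# A vision module is assumed to supply the vehicle's relative position
# (x, y, z) and heading to the visual cue; the policy maps that state to
# velocity / yaw-rate commands, trained on logged pilot demonstrations.
import torch
import torch.nn as nn

STATE_DIM = 4    # relative x, y, z, heading from the vision system (assumed)
ACTION_DIM = 4   # commanded vx, vy, vz, yaw rate (assumed)

# Simple MLP policy: relative pose -> control command
policy = nn.Sequential(
    nn.Linear(STATE_DIM, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, ACTION_DIM),
)

# Placeholder demonstration data; in practice these would be
# (relative pose, pilot command) pairs recorded during manual landings.
demo_states = torch.randn(1024, STATE_DIM)
demo_actions = torch.randn(1024, ACTION_DIM)

optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Behavioral cloning: regress the pilot's commands from the observed state.
for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(policy(demo_states), demo_actions)
    loss.backward()
    optimizer.step()

# At run time, the trained policy is queried with the latest vision estimate.
with torch.no_grad():
    relative_pose = torch.tensor([[1.5, -0.3, 2.0, 0.1]])  # example estimate
    command = policy(relative_pose)  # vx, vy, vz, yaw-rate command
```

In a deployed system, the demonstration data would come from an experienced pilot flying approaches to the visual cue, and the policy's outputs would feed the vehicle's inner-loop flight controller.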
A demonstration of this technology in action onboard an autonomously landing quadcopter
Students: Bochan Lee, Vishnu Saj, Keerat Singh