Autonomous Drone Control System


This is a very brief article on the research I performed with Philip Graziani at the Lake Superior State University School of Computer Science under Dr. Christopher Smith. I'd like to take a moment to thank Dr. Smith and LSSU for the research opportunity. It took us about 9 months to complete, and truly pushed the limits of our understanding.

This project was written in C++ with the use of OpenCV (the Open Source Computer Vision library) and vJoy, a virtual joystick driver.


Most autonomous tracking applications used by aerial vehicles today rely on instrumented targets to perform basic tracking tasks. Our approach was to create a visual servoing application that controls an autonomous aerial vehicle using pressure snakes. Using only one camera mounted on a quadcopter via a gimbal, we can track an object of any shape or size without attaching any instrument to the target.


With the advent of personal aerial vehicles, the market for tracking applications has seen significant growth. Aerial vehicles' uses now range from security applications to the recording of sports. As demand increases, so too does the need for advances in automation for situations where instrumenting the target or piloting by hand is impractical or not financially feasible.

The cornerstone of this work is the research on active contour models performed by Christopher E. Smith and Doug P. Perrin. Link

In our approach to autonomous tracking, we overlay a pressure snake onto some discernible target, and then analyze movements of the snake to approximate target movement along the x, y, and z axes.
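As a rough illustration of this step (a sketch, not our exact implementation, which operates on OpenCV contour data), x/y motion can be approximated from the frame-to-frame shift of the snake's centroid, and z motion from the relative change in the area the snake encloses:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct Pt { double x, y; };

// Centroid of the snake's control points; approximates the target's
// position in the image plane.
static Pt centroid(const std::vector<Pt>& snake) {
    Pt c{0.0, 0.0};
    for (const auto& p : snake) { c.x += p.x; c.y += p.y; }
    c.x /= snake.size();
    c.y /= snake.size();
    return c;
}

// Enclosed area via the shoelace formula; the area shrinks as the
// target recedes, so its change stands in for motion along z.
static double area(const std::vector<Pt>& snake) {
    double a = 0.0;
    for (std::size_t i = 0; i < snake.size(); ++i) {
        const Pt& p = snake[i];
        const Pt& q = snake[(i + 1) % snake.size()];
        a += p.x * q.y - q.x * p.y;
    }
    return std::fabs(a) / 2.0;
}

// Per-frame motion estimate: (dx, dy) from the centroid shift, dz from
// the relative area change (positive when the target moves away).
struct Motion { double dx, dy, dz; };

Motion estimateMotion(const std::vector<Pt>& prev,
                      const std::vector<Pt>& cur) {
    Pt c0 = centroid(prev), c1 = centroid(cur);
    double a0 = area(prev), a1 = area(cur);
    return { c1.x - c0.x, c1.y - c0.y, (a0 - a1) / a0 };
}
```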

The work we built on used the LAB color space, which can produce a 'noisy environment' in which the snake contracts or expands in unwanted ways. To keep the snake stable in such an environment, we manually fine-tune the LAB scales for the specified target, and additionally implement a Kalman filter, fed with axial movement predictions, to judge when the snake overlay needs to be reset.
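The idea behind the reset test can be sketched with a minimal scalar Kalman filter per axis (an illustrative sketch, not our actual filter or noise values): when a new measurement disagrees wildly with the filter's prediction, the snake has likely latched onto noise and should be re-seeded on the target.

```cpp
#include <cmath>

// Minimal scalar Kalman filter over one axis of estimated target
// motion. One such filter would run per axis; the noise parameters
// here are placeholders, not tuned values.
class AxisKalman {
public:
    AxisKalman(double processNoise, double measurementNoise)
        : q(processNoise), r(measurementNoise) {}

    // Fold in this frame's raw measurement; returns the smoothed estimate.
    double update(double measurement) {
        p += q;                      // predict: uncertainty grows
        double k = p / (p + r);      // Kalman gain
        innovation = measurement - x;
        x += k * innovation;         // correct toward the measurement
        p *= (1.0 - k);
        return x;
    }

    // Large disagreement between measurement and prediction suggests the
    // snake has drifted onto noise and the overlay should be reset.
    bool needsReset(double threshold) const {
        return std::fabs(innovation) > threshold;
    }

private:
    double x = 0.0;          // state estimate
    double p = 1.0;          // estimate variance
    double q, r;             // process / measurement noise
    double innovation = 0.0; // last measurement residual
};
```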

A snake overlaid onto a target during testing

We then map the approximated target movements to specific flight controls by way of a single-variable proportional control law tuned for each axis. For each axis, the proportional control law can be expressed as u(t) = Kp · e(t), where e(t) is the estimated target displacement along that axis and Kp is that axis's gain.

Source : Link

Lacking an SDK for the specific drone we were using, our final obstacle was feeding commands to the vehicle. Luckily, the drone manufacturer had desktop software that allowed the drone to be controlled via joystick. This is when we found vJoy, written by Shaul Eizikovich. Using this wonderful software, we were able to programmatically send the control law's joystick commands to the manufacturer's software.
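The glue between the control law and the virtual joystick amounts to a range conversion (a sketch under the assumption that, as in vJoy's C API, an axis spans roughly 0–32768 with 16384 as center; check your vJoy version's documented range before relying on this):

```cpp
#include <algorithm>
#include <cstdint>

// Map a normalized command in [-1, 1] onto a vJoy-style axis value.
// Assumption: the axis spans roughly 0..32768 with 16384 as center.
int32_t toJoystickAxis(double command) {
    double clamped = std::max(-1.0, std::min(1.0, command));
    return static_cast<int32_t>(16384.0 + clamped * 16384.0);
}

// In the real system, this value is then handed to the vJoy driver
// (e.g. via its SetAxis interface) so the manufacturer's desktop
// software reads it as stick input.
```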


While we did end up crashing the drone quite a few times in testing, and broke more parts than I thought any drone could possibly have, we got our approach to this problem working. The only limit to our success was the range at which we were able to stream video to our software. While the hardware claimed to have a 2-mile reach, we were unable to get past 10 feet (rather ridiculous). Once the feed cuts out, the rate of change in snake size flags an emergency landing procedure (coded only after it happened once and nearly took someone out).
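That failsafe boils down to watching the frame-to-frame change in snake area (a sketch with an illustrative threshold, not the value we flew with): when the feed cuts out, the snake collapses or explodes far faster than any real target could move.

```cpp
#include <cmath>

// Flags an emergency landing when the snake's area changes faster than
// the target could plausibly move, which is what happens when the
// video feed cuts out. The threshold is illustrative only.
class FeedLossMonitor {
public:
    explicit FeedLossMonitor(double maxRelativeChange)
        : limit(maxRelativeChange) {}

    // Returns true if this frame's snake area jumped enough, relative
    // to the last frame, to trigger the emergency landing procedure.
    bool shouldLand(double snakeArea) {
        bool trigger = false;
        if (lastArea > 0.0) {
            double rel = std::fabs(snakeArea - lastArea) / lastArea;
            trigger = rel > limit;
        }
        lastArea = snakeArea;
        return trigger;
    }

private:
    double limit;
    double lastArea = -1.0; // negative until the first frame is seen
};
```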