Autonomous Drone Control System

Background:

During their senior year at Lake Superior State University, students are given the opportunity to engage in a senior project that acts as a capstone of their time as undergraduates. These projects typically pair students with local businesses that need software developed, giving them experience in all phases of designing and implementing software. In my case, I was given the unique opportunity to participate in computer vision research with my fellow student Philip Graziani.

Project:

Instead of working with a local business, Mr. Graziani and I worked under Dr. Christopher Smith's guidance on a visual tracking and automation research project. Our task was to automate an aerial vehicle to track any selected object using an active deformable model. This model was derived from Dr. Smith and Doug P. Perrin's research on active contour models [1]. Dr. Smith provided the aerial vehicle and all hardware necessary for development, but no control software existed. The only existing code was an OpenCV [2] application that could overlay an active contour model onto webcam input.

Approach:

All seemingly complex problems can be boiled down to a grouping of simple problems. After analyzing what was before us, we broke the project into three separate tasks. The first and foremost was the projection of the active model's movements in two-dimensional space (movements on camera) into three-dimensional space (an approximation of real-world movements). Second, we needed a way to control the aerial vehicle programmatically. No software development kit (SDK) was available to us, so we had to get creative. The vehicle in use came with 'Ground Station' software that let a user control the vehicle from a computer interface instead of the standard remote control, so we decided to build our software around the idea of mimicking user input to this 'Ground Station.' Finally, our third task was to tie the two separate code bases together such that the movements of the active model were translated directly into vehicle controls. A sketch of how the three pieces fit together follows.
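To make that division concrete, here is a minimal Python sketch of the pipeline. Every name in it (track_object, image_to_world, VirtualStick) is a hypothetical stand-in for our actual modules; only the overall shape of the loop reflects the approach described above.

    import cv2

    def track_object(frame):
        """Task 1 stand-in: locate the tracked object in image space.
        (The real project fit an active deformable model here.)"""
        h, w = frame.shape[:2]
        return (w / 2.0, h / 2.0)  # placeholder: pretend the target is centered

    def image_to_world(cx, cy, frame):
        """Task 1 stand-in: turn image-space offsets into normalized
        approximations of real-world offsets."""
        h, w = frame.shape[:2]
        return ((cx - w / 2.0) / w, (cy - h / 2.0) / h)

    class VirtualStick:
        """Task 2 stand-in: the real project drove a vJoy virtual joystick."""
        def update(self, dx, dy):
            print(f"stick deflection: dx={dx:+.3f}, dy={dy:+.3f}")

    def run_control_loop(camera_index=0):
        capture = cv2.VideoCapture(camera_index)
        stick = VirtualStick()
        while capture.isOpened():
            ok, frame = capture.read()
            if not ok:
                break
            cx, cy = track_object(frame)            # task 1: track in 2-D
            dx, dy = image_to_world(cx, cy, frame)  # task 1: approximate 3-D motion
            stick.update(dx, dy)                    # task 3: tie tracking to control

    if __name__ == "__main__":
        run_control_loop()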

Obstacles:

Each of the three tasks had significant obstacles to overcome. For the first task, we had to account for 'noisy' environments. With a wireless camera feed and potentially volatile lighting conditions, it was imperative that we create methods to keep our model from collapsing. Given the automated nature of the system, a collapsed model could be disastrous and a danger to people near the vehicle. Using a Kalman filter [3], we were able to discern when bogus frames (that is, images with a high degree of noise) were being fed into the system.
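The rejection idea can be sketched with OpenCV's built-in cv2.KalmanFilter: predict where the tracked point should be, and treat any measurement landing too far from that prediction as a bogus frame to be dropped rather than corrected into the filter. The constant-velocity state model and the gate threshold below are illustrative assumptions, not our actual tuning.

    import numpy as np
    import cv2

    # Constant-velocity model over the tracked point: state = (x, y, vx, vy).
    kf = cv2.KalmanFilter(4, 2)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], dtype=np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], dtype=np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3
    kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
    kf.errorCovPost = np.eye(4, dtype=np.float32)

    GATE_PIXELS = 40.0  # assumed rejection threshold; tune per camera and noise

    def accept_measurement(x, y):
        """Return True if the measured point is consistent with the prediction.

        A measurement further than GATE_PIXELS from the filter's prediction
        is treated as noise and never corrected into the filter, so a burst
        of bad frames cannot drag the model off its target."""
        predicted = kf.predict()
        innovation = np.hypot(x - float(predicted[0, 0]),
                              y - float(predicted[1, 0]))
        if innovation > GATE_PIXELS:
            return False  # bogus frame: keep the prediction, drop the measurement
        kf.correct(np.array([[x], [y]], dtype=np.float32))
        return True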

Controlling the vehicle presented an interesting challenge. We found a free joystick driver, 'Virtual Joystick' (vJoy) [4], that allowed us to programmatically simulate a joystick in the aerial vehicle's 'Ground Station' software. Essentially, vJoy enabled us to mimic a human pilot. While the premise sounds simple enough, we had to work through a multitude of boundary cases to ensure flights ended safely if something went wrong. In addition to safety checks, we implemented a series of flight tests to verify the accuracy of our programming before tying control to the deformable model.
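As a rough illustration, the sketch below drives a vJoy device through the third-party pyvjoy Python binding; treat the binding and the device id as assumptions, since our own code spoke to the driver differently. The clamping and the fail-safe mirror the kind of boundary-case checks described above.

    import pyvjoy  # third-party binding to the vJoy virtual joystick driver

    # vJoy axis values span 0x1..0x8000; the midpoint is a centered stick.
    AXIS_MIN, AXIS_MAX = 0x1, 0x8000
    AXIS_CENTER = (AXIS_MIN + AXIS_MAX) // 2

    device = pyvjoy.VJoyDevice(1)  # device id 1, as configured in the vJoy driver

    def to_axis(deflection):
        """Map a deflection in [-1.0, 1.0] to a legal axis value, clamping so
        a runaway upstream command can never push the stick past its range."""
        raw = int(AXIS_CENTER + deflection * (AXIS_MAX - AXIS_MIN) / 2.0)
        return max(AXIS_MIN, min(AXIS_MAX, raw))

    def set_stick(x, y):
        """Deflect the virtual stick; x and y are in [-1.0, 1.0]."""
        device.set_axis(pyvjoy.HID_USAGE_X, to_axis(x))
        device.set_axis(pyvjoy.HID_USAGE_Y, to_axis(y))

    def fail_safe():
        """Recenter the stick, e.g. when tracking is lost or a check trips."""
        set_stick(0.0, 0.0)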

Translating the movements detected by the deformable model into aerial vehicle controls required the construction of a proportional control law [5]. This law let us map movements detected by the deformable model to the proper position of a joystick such that the vehicle maintained the same distance from the target object at all times. During this stage it was imperative to include yet another set of safety checks to ensure that rapid changes in our deformable model didn't trigger rapid stick adjustments.
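In its simplest form the law is u = Kp * e, where e is the offset between where the target should sit in the frame and where the model currently sees it, and u is the resulting stick deflection. Below is a minimal single-axis sketch; the rate limiter stands in for the 'no rapid adjustments' safety check, and the gain and step limit are assumed values rather than our flight tuning.

    class ProportionalAxis:
        """One axis of a proportional control law: output = kp * error, with
        the output's rate of change capped so a sudden jump in the deformable
        model cannot slam the stick."""

        def __init__(self, kp=0.8, max_step=0.05):
            self.kp = kp              # proportional gain (assumed value)
            self.max_step = max_step  # max stick change per frame (assumed value)
            self.output = 0.0

        def update(self, error):
            target = self.kp * error                              # raw P-law command
            step = target - self.output
            step = max(-self.max_step, min(self.max_step, step))  # rate limit
            self.output = max(-1.0, min(1.0, self.output + step)) # saturate
            return self.output

    # Example: the target drifted 0.3 (normalized units) right of center.
    axis = ProportionalAxis()
    for _ in range(5):
        print(round(axis.update(0.3), 3))  # ramps 0.05 -> 0.1 -> ... -> 0.24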

Testing:

As with any software, testing was key. Every unit we developed was tested in a simulated environment multiple times before live test flights. Even so, reality proved much different from simulation. We crashed the vehicle a multitude of times, and at one point it chased down a target with extreme prejudice; thankfully we had tethered the vehicle down and nobody was injured.

References and links:

  1. Christopher E. Smith, Doug P. Perrin. Rethinking Classical Internal Forces for Active Contour Models 
    http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.17.9315&rep=rep1&type=pdf 
  2. OpenCV (Open Source Computer Vision Library). Designed for computational efficiency and with a strong focus on real-time applications.
    https://opencv.org/ 
  3. Kalman Filter. An algorithm that estimates unknown variables from a series of noisy measurements.
    https://en.wikipedia.org/wiki/Kalman_filter 
  4. Virtual Joystick (vJoy). A device driver that bridges the gap between any device that is not a joystick and an application that requires a joystick.
    http://vjoystick.sourceforge.net/site/ 
  5. Proportional Control Law.
    https://en.wikipedia.org/wiki/Proportional_control