Before we began work, this project was already partially complete. A previous senior design group, and then an intern at Vermeer, successively worked on an Android application that can control the drone and tell it where to fly and where to take pictures. These are known as the autopilot and waypoint components.
After examining the application as it stood and reflecting on how to bring the project to completion, we gathered requirements again to get a firmer understanding of what Vermeer wanted: a mapping application that works quickly, locally, and preferably without internet access.
Some bugs still remained in the code, so we began by fixing what we could and adding the features the new requirements called for. This included better handling of waypoint generation (the coordinates at which the drone takes pictures) and an elegant way of extracting the pictures from the drone's onboard SD card without digging too deep into DJI's API.
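Waypoint generation for a survey typically means laying out a serpentine ("lawnmower") grid of coordinates over the target area. The following is a hypothetical sketch of that idea, not the app's actual code; the function name, parameters, and grid spacing are all illustrative, and the real application feeds its waypoints into DJI mission objects.

```python
# Hypothetical sketch of lawnmower-style waypoint generation over a
# rectangular survey area. Names and parameters are illustrative only.

def generate_waypoints(lat_min, lat_max, lon_min, lon_max, rows, cols):
    """Return a serpentine (back-and-forth) list of (lat, lon) pairs."""
    lat_step = (lat_max - lat_min) / (rows - 1) if rows > 1 else 0.0
    lon_step = (lon_max - lon_min) / (cols - 1) if cols > 1 else 0.0
    waypoints = []
    for r in range(rows):
        lat = lat_min + r * lat_step
        # Alternate sweep direction each row so the drone never backtracks.
        col_order = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in col_order:
            waypoints.append((lat, lon_min + c * lon_step))
    return waypoints

# Example: a 3x3 survey grid over a small area
wps = generate_waypoints(41.0, 41.002, -92.0, -91.998, 3, 3)
```

The serpentine ordering matters in practice: it minimizes travel between consecutive waypoints, which is one of the biggest factors in total flight time.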
We later held many professional development meetings to finalize the components and their functions, most notably the stitching algorithm, which uses either OpenCV to combine the images into a single panoramic image, or our low-memory solution in the event OpenCV cannot handle the number or contrast of the images.
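To illustrate the low-memory idea in the abstract: since the waypoints place the photos on a known grid, a fallback can tile them into a mosaic one output row at a time, holding only one strip of source images in memory rather than doing full feature matching. This is a hypothetical sketch of that approach, not the project's actual implementation; images are modeled as plain 2D lists of pixel values for illustration.

```python
# Hypothetical sketch of a low-memory grid-mosaic fallback. Images are
# modeled as 2D lists (rows of pixel values) and assumed equally sized.

def mosaic_rows(image_grid):
    """Yield output rows for a row-major grid of equally sized images."""
    for image_row in image_grid:        # process one strip of images at a time
        height = len(image_row[0])
        for y in range(height):
            out_row = []
            for img in image_row:
                out_row.extend(img[y])  # append this image's pixels for row y
            yield out_row               # emit immediately; no full mosaic held

# Example: four 2x2 "images" arranged in a 2x2 grid
a = [[1, 1], [1, 1]]; b = [[2, 2], [2, 2]]
c = [[3, 3], [3, 3]]; d = [[4, 4], [4, 4]]
mosaic = list(mosaic_rows([[a, b], [c, d]]))
```

Because rows are yielded as they are built, such an approach can stream its output straight to disk, which is what makes it viable when OpenCV's stitcher runs out of headroom.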
As the project stands now, in the simplest terms possible: the application, together with the drone and its controller, can select waypoints for a local area, fly the drone to that area, move efficiently between waypoints capturing images at each one, send the images back to the phone, stitch them on the phone, and generate a GIS-compatible file, all in ten minutes or less depending on the number of waypoints.
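One common way an image becomes "GIS-compatible" is a sidecar world file (e.g. a `.pgw` next to a `.png`): six lines giving pixel scale, rotation, and the center of the top-left pixel. The sketch below is an assumption about the output format for illustration; the source does not specify which GIS format the app actually emits.

```python
# Hypothetical sketch of writing world-file contents for a north-up,
# unrotated stitched image. The real app's output format may differ.

def world_file(lon_min, lat_max, lon_max, lat_min, width_px, height_px):
    """Return the six-line world-file string for a north-up image."""
    x_scale = (lon_max - lon_min) / width_px    # degrees per pixel, x
    y_scale = -(lat_max - lat_min) / height_px  # negative: rows run southward
    x_origin = lon_min + x_scale / 2            # center of top-left pixel
    y_origin = lat_max + y_scale / 2
    # Order mandated by the format: x_scale, y_skew, x_skew, y_scale, x, y
    return "\n".join(f"{v:.10f}" for v in
                     (x_scale, 0.0, 0.0, y_scale, x_origin, y_origin))

# Example: a 2000x2000 px stitch covering the survey bounding box
contents = world_file(-92.0, 41.002, -91.998, 41.0, 2000, 2000)
```

A GIS package such as QGIS will pick up a correctly named world file automatically and place the image on the map.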
There are plenty of features, and plenty of room for expansion. We are proud to have worked on this project, and we hope Vermeer and anyone else fortunate enough to try it out enjoys it.