Digital forensics traditionally operates on the “one investigator, one system, one case” model. Distributed collection and processing solutions certainly exist, but far too many investigators are still managing cases on a single Windows system. This simply doesn’t scale to meet current and future demands for processing speed, managing greater volumes of data, and increased collaboration.

URSA’s autonomous vehicle forensics platform is built around an extensible framework to address current and future requirements for distributed evidence handling and analysis. URSA’s data classification plugin layer is a key enabler of both extensibility and flexibility. URSA’s classifiers enable our engineers and third parties to easily codify algorithms, statistics, and machine learning models that can be applied to individual flights or groups of flights, retrospectively and in real time.
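To make the plugin idea concrete, here is a minimal sketch of what a classifier plugin layer like the one described might look like. All names (`FlightClassifier`, `SpeedAnomalyClassifier`, the registry) are illustrative assumptions, not URSA's actual API; the point is that a classifier maps per-timestamp telemetry to labels, so the same plugin can run retrospectively or on a live stream.

```python
# Hypothetical sketch of a classifier plugin interface (illustrative names,
# not URSA's actual API). Each plugin maps flight telemetry samples to
# per-timestamp labels.
from abc import ABC, abstractmethod
from typing import Dict, List


class FlightClassifier(ABC):
    """Base class a classifier plugin would implement."""

    @abstractmethod
    def classify(self, telemetry: List[Dict[str, float]]) -> List[Dict[str, float]]:
        """Map each telemetry sample to a labeled result."""


class SpeedAnomalyClassifier(FlightClassifier):
    """Toy example: flag samples whose ground speed exceeds a limit."""

    def __init__(self, max_speed_mps: float = 20.0):
        self.max_speed_mps = max_speed_mps

    def classify(self, telemetry):
        return [
            {"t": s["t"], "anomaly": 1.0 if s["speed"] > self.max_speed_mps else 0.0}
            for s in telemetry
        ]


# A registry lets third parties add classifiers without touching core code.
REGISTRY: Dict[str, FlightClassifier] = {}


def register(name: str, clf: FlightClassifier) -> None:
    REGISTRY[name] = clf


register("speed_anomaly", SpeedAnomalyClassifier())
flight = [{"t": 0.0, "speed": 5.0}, {"t": 1.0, "speed": 25.0}]
labels = REGISTRY["speed_anomaly"].classify(flight)  # → second sample flagged
```

A registry of this shape is one common way to let third-party code participate without modifying the core framework.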

URSA is working closely with law enforcement and military partners to detect, investigate, and act on UAV (aka drone) evidence rapidly and efficiently. One common use case is payload delivery – detecting cases where a UAV dropped a payload such as contraband into a prison, a 40mm grenade on allied forces, or drugs onto U.S. soil. Let’s see how one plugin is designed to detect this type of activity.

With a downed drone in hand, it would be very useful to know whether the drone dropped something and, if so, when and where the event occurred. Teams responsible for interdicting smugglers could analyze UAV data while still at sea and deploy additional assets to indicated drop zones within minutes of seizing the UAV.

URSA Secure is developing a deep learning model to do just that.

To test this model we flew a DJI Phantom 3 and delivered three payloads, one while stationary and two while flying forward at top speed. Each payload drop was preceded by a takeoff event and followed by a landing event.

Here is a graph of the flight over time. You are looking at the drone’s data after it has been processed and shaped into the form used by the deep learning model.

Motor RPM and power data were combined and fed into the model, which checks every timestamp for telltale signs of a payload drop and labels the high-probability events.
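As a rough intuition for what the model is looking for: shedding a payload makes the aircraft lighter, so motor RPM and power drop sharply and simultaneously. The sketch below is an illustrative heuristic stand-in, not URSA's deep learning model; the window size and normalization constants are invented for the example.

```python
# Illustrative stand-in for the detection step (NOT URSA's actual model):
# score each timestamp by how sharply motor RPM and power both decreased,
# which is the signature a payload drop tends to leave in telemetry.
def score_timestamps(rpm, power, window=3):
    """Return a pseudo-probability of a payload drop at each index."""
    scores = [0.0] * len(rpm)
    for i in range(window, len(rpm)):
        rpm_delta = rpm[i - window] - rpm[i]      # decrease over the window
        pwr_delta = power[i - window] - power[i]
        # Normalize deltas into [0, 1] (scale factors are assumptions)
        # and combine; negative deltas (increases) clamp to zero.
        r = max(0.0, min(1.0, rpm_delta / 500.0))
        p = max(0.0, min(1.0, pwr_delta / 50.0))
        scores[i] = r * p
    return scores


def label_events(scores, threshold=0.75):
    """Indices where the score first crosses the threshold upward."""
    return [i for i in range(1, len(scores))
            if scores[i] >= threshold > scores[i - 1]]


# Synthetic data: steady hover, then a sudden RPM/power drop at index 5.
rpm = [6000.0] * 5 + [5400.0] * 5
power = [400.0] * 5 + [340.0] * 5
events = label_events(score_timestamps(rpm, power))  # → [5]
```

A learned model replaces the hand-tuned scoring function, but the per-timestamp scoring followed by threshold-crossing event labeling matches the pipeline described above.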

Here is a graph of the data after being run through the model, showing the probability of a payload drop at each point in time.

The yellow and green charts show the telemetry data input and the red chart shows the probability of a payload drop at each point in time. The model predicts three payload drop events.

But were these actually the spots where the payload drops occurred?
We used the Phantom’s left front light to trigger the payload release, turning it on and off with one of the user-defined switches on the remote controller. Toggling this switch generates a ‘custom trigger 1’ event in the DJI Go application log. By analyzing that log, we can verify that the model, acting on data from just the UAV, works as expected.
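The verification step amounts to matching each predicted drop time against a ground-truth trigger timestamp from the app log. A minimal sketch, with made-up timestamps and an assumed matching tolerance:

```python
# Hedged sketch of the verification step: pair each ground-truth
# 'custom trigger 1' timestamp with at most one model prediction that
# falls within a small tolerance. All timestamps are illustrative.
def match_events(predicted, ground_truth, tolerance_s=2.0):
    """Return (matched pairs, unmatched predictions)."""
    matched, remaining = [], list(predicted)
    for gt in ground_truth:
        hit = next((p for p in remaining if abs(p - gt) <= tolerance_s), None)
        if hit is not None:
            matched.append((gt, hit))
            remaining.remove(hit)
    return matched, remaining  # remaining = potential false positives


triggers = [42.0, 118.5, 201.0]       # from the app log (illustrative)
predictions = [41.6, 118.9, 200.4]    # from the model (illustrative)
pairs, false_pos = match_events(predictions, triggers)
# All three drops matched, no false positives.
```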

As you can see, the model was spot on, detecting all three payload drops. Further, while we currently have the threshold set at 75%, it appears the threshold could be set higher, decreasing the chance of false positives. The architecture allows users to tune plugins to their own data and requirements.
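Tuning a threshold like this is straightforward to automate: sweep candidate cutoffs over labeled flights and count hits and false positives at each. The sketch below is illustrative tooling under assumed data shapes, not URSA's implementation.

```python
# Illustrative threshold sweep (not URSA's tooling): for each candidate
# threshold, extract threshold-crossing events from per-timestamp scores
# and compare them to ground-truth event indices.
def sweep(scores, truth_indices, thresholds, tolerance=2):
    """Return {threshold: (hits, false_positives)}."""
    results = {}
    for th in thresholds:
        events = [i for i in range(1, len(scores))
                  if scores[i] >= th > scores[i - 1]]
        hits = sum(any(abs(e - t) <= tolerance for e in events)
                   for t in truth_indices)
        false_pos = sum(not any(abs(e - t) <= tolerance for t in truth_indices)
                        for e in events)
        results[th] = (hits, false_pos)
    return results


# Synthetic probabilities with three peaks and ground truth at those indices.
scores = [0.1, 0.2, 0.95, 0.3, 0.1, 0.88, 0.2, 0.1, 0.92, 0.1]
truth = [2, 5, 8]
table = sweep(scores, truth, [0.75, 0.90])
# At 0.75 all three events fire; at 0.90 the 0.88 peak no longer does.
```

An operator can then pick the strictest threshold that still catches every known drop in their own data.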

This is a simple yet important example of the power of URSA’s framework.