
Managing FMV with Jagwire and Passive Analytics


The rapid growth of unmanned aerial vehicles (UAVs) and payloads has produced an ever-growing deluge of data that must be archived, sorted, analyzed, and distributed to consumers across the defense, agriculture, and utility markets. In many cases, especially with full motion video (FMV), a single flight can yield several hours of data to be viewed and analyzed, and often only a small fraction of it is useful for analysis. For larger UAV fleets flying multiple simultaneous missions, substantial resources are required to perform this analysis, so the cost of analyzing these data products grows in proportion to the amount of data collected.

For systems that have adopted properly formatted metadata, we can filter this glut of data by analyzing patterns in the metadata and inferring operator intent from domain knowledge. For example, temporal "pauses" in the sensor's center field of view may indicate an area or point of interest that warrants further analysis. Circular patterns in the sensor center field of view can indicate inspection of a significant building, object, or structure. Smooth "pans" or "sweeping" motions across the ground suggest a collection aimed at covering an area on the ground.
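As a rough illustration of the first pattern, the sketch below flags "pause" (dwell) segments from a sequence of frame-center coordinates of the kind carried in FMV metadata streams (for example, the frame center latitude/longitude fields of MISB ST 0601 KLV). The FrameCenter structure, field names, and thresholds are assumptions made for the example, not Jagwire's implementation.

```python
import math
from dataclasses import dataclass


@dataclass
class FrameCenter:
    """One per-frame metadata sample (illustrative field names)."""
    time_s: float   # timestamp, seconds into the video
    lat: float      # frame center latitude, degrees
    lon: float      # frame center longitude, degrees


def ground_distance_m(a: FrameCenter, b: FrameCenter) -> float:
    """Approximate ground distance between two frame centers
    (equirectangular approximation, fine at these scales)."""
    mean_lat = math.radians((a.lat + b.lat) / 2.0)
    dx = math.radians(b.lon - a.lon) * math.cos(mean_lat) * 6_371_000
    dy = math.radians(b.lat - a.lat) * 6_371_000
    return math.hypot(dx, dy)


def find_dwell_segments(samples, min_duration_s=5.0, max_drift_m=50.0):
    """Return (start_s, end_s) spans where the sensor center 'pauses',
    i.e. stays within max_drift_m of where the span began for at least
    min_duration_s -- a crude proxy for a point of interest."""
    segments = []
    anchor = last = None
    for s in samples:
        if anchor is None:
            anchor = last = s
            continue
        if ground_distance_m(anchor, s) <= max_drift_m:
            last = s
            continue
        # Center moved away: keep the candidate span if it lasted long enough.
        if last.time_s - anchor.time_s >= min_duration_s:
            segments.append((anchor.time_s, last.time_s))
        anchor = last = s
    if anchor is not None and last.time_s - anchor.time_s >= min_duration_s:
        segments.append((anchor.time_s, last.time_s))
    return segments
```

Circular inspection orbits and sweeping area collections could be scored in a similar way, for instance by fitting the recent frame-center track to a circle or by measuring the swept ground footprint, with thresholds tuned to the platform and mission type.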

Jagwire has designed and prototyped algorithms that identify these useful segments of video by analyzing the metadata embedded within the video stream. These "passive analytics" run in real time during the UAV flight and identify sub-sections of video that are far more likely to be useful in more detailed analysis. By dynamically detecting and setting aside these sub-clips, the burden of first-phase analysis can be greatly reduced, allowing users to focus their analytical and dissemination resources on the challenges of their market space rather than wading through a sea of irrelevant data.
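A streaming variant of the dwell detector above conveys the real-time flavor of this approach: metadata samples are pushed in as they arrive, and flagged sub-clips (with a little padding on each side) accumulate for first-phase review. It reuses the FrameCenter and ground_distance_m helpers from the previous sketch, and the thresholds and padding are again placeholder values rather than Jagwire's actual parameters.

```python
class PassiveClipMarker:
    """Illustrative streaming dwell detector: feed frame-center samples
    as they arrive and collect (start_s, end_s) sub-clips worth review."""

    def __init__(self, min_duration_s=5.0, max_drift_m=50.0, pad_s=2.0):
        self.min_duration_s = min_duration_s
        self.max_drift_m = max_drift_m
        self.pad_s = pad_s
        self.anchor = None   # first sample of the current candidate dwell
        self.last = None     # most recent sample still inside the dwell
        self.clips = []      # flagged sub-clips, as (start_s, end_s) pairs

    def push(self, sample: FrameCenter) -> None:
        """Consume one metadata sample in arrival order."""
        if self.anchor is None:
            self.anchor = self.last = sample
            return
        if ground_distance_m(self.anchor, sample) <= self.max_drift_m:
            self.last = sample
            return
        self._close()
        self.anchor = self.last = sample

    def finish(self) -> None:
        """Call at end of stream to close out any open candidate dwell."""
        self._close()

    def _close(self) -> None:
        if (self.anchor is not None
                and self.last.time_s - self.anchor.time_s >= self.min_duration_s):
            self.clips.append((max(0.0, self.anchor.time_s - self.pad_s),
                               self.last.time_s + self.pad_s))
        self.anchor = self.last = None
```

In a production system the flagged intervals would feed whatever clipping, tagging, or dissemination workflow sits downstream; the point of the sketch is simply that the decision can be made from the metadata alone, while the flight is still in progress.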