US Army researchers use a drone-based multi-sensor system to enable standoff detection of explosive hazards using machine learning techniques.
Augmented reality (AR) overlays will be used in US Army research on the detection of roadside explosive hazards, such as improvised explosive devices (IEDs), unexploded ordnance and landmines. Route reconnaissance in support of convoy operations remains a critical function to keep Soldiers safe from such hazards, which continue to threaten operations abroad and prove to be an evolving and problematic adversarial tactic. The problem is that no single sensor solution performs well across all emplacement scenarios.
The US Army Combat Capabilities Development Command (DEVCOM), Army Research Laboratory (ARL), and other research collaborators were funded by the Defense Threat Reduction Agency through a program focused on a system-of-systems approach to standoff explosive hazard detection.
In Phase I of the program, researchers spent 15 months evaluating mostly high technology readiness level (TRL) standoff detection technologies against a variety of explosive hazard emplacements. In addition, a lower-TRL standoff detection sensor, focused on the detection of explosive hazard triggering devices, was developed and assessed. According to the Army, the Phase I assessment included probability of detection, false alarm rate and other key measures that will ultimately lead to a down-selection of sensors based on best performance for Phase II of the program.
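The article does not describe how these metrics were computed, but the two headline figures it names are conventional in detection assessments. As an illustrative sketch only (function names and all numbers below are hypothetical, chosen to mirror the reported test scale, not taken from the program):

```python
# Hypothetical sketch of the two metrics named in the Phase I assessment:
# probability of detection (Pd) and false alarm rate (FAR). These are
# standard definitions, not the program's actual scoring code.

def probability_of_detection(true_positives: int, total_targets: int) -> float:
    """Fraction of emplaced hazards the detector actually flagged."""
    return true_positives / total_targets if total_targets else 0.0

def false_alarm_rate(false_positives: int, track_km: float) -> float:
    """False alarms normalized per kilometer of surveyed route."""
    return false_positives / track_km if track_km else 0.0

# Illustrative numbers: a sensor flags 500 of 625 emplacements and raises
# 14 false alarms over a 7 km track.
pd = probability_of_detection(500, 625)
far = false_alarm_rate(14, 7.0)
print(f"Pd = {pd:.2f}, FAR = {far:.1f} alarms/km")  # Pd = 0.80, FAR = 2.0 alarms/km
```

Down-selection then becomes a trade-off between a high Pd and an acceptably low FAR for each candidate sensor.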
The sensors evaluated during Phase I included an airborne synthetic aperture radar, ground vehicular and small unmanned aerial vehicle LIDAR, high-definition electro-optical cameras, long-wave infrared cameras and a non-linear junction detection radar.
Researchers carried out a field test on real-world representative terrain over a 7-kilometer test track, with a total of 625 emplacements comprising a variety of explosive hazards, simulated clutter and calibration targets. They collected data before and after emplacement to simulate a real-world change between sensor passes.
Terabytes of data were collected across the sensor sets, a volume needed to adequately train artificial intelligence/machine learning (AI/ML) algorithms. The algorithms subsequently performed automatic target detection for each sensor.
The detection algorithms provide 'confidence levels' for each suspected target, which are displayed to a user as an augmented reality overlay. The detection algorithms were executed with various sensor permutations so that performance results could be aggregated to determine the best course of action moving forward into Phase II.
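The article does not say how per-sensor confidences were combined or how the permutations were scored. One common way to reason about such a comparison is noisy-OR fusion over every sensor subset; the sketch below is purely illustrative (the sensor names, confidence values, and fusion rule are assumptions, not the program's method):

```python
from itertools import combinations

# Illustrative sketch (not the program's actual fusion logic): score one
# suspected target under every sensor subset using noisy-OR fusion of
# per-sensor confidence levels, then rank the subsets.

def fuse_confidences(confidences) -> float:
    """Noisy-OR: probability that at least one sensor's detection is
    correct, under the simplifying assumption of independent sensors."""
    miss = 1.0
    for c in confidences:
        miss *= (1.0 - c)
    return 1.0 - miss

# Hypothetical per-sensor confidences for one suspected emplacement.
scores = {"SAR": 0.60, "lidar": 0.45, "LWIR": 0.30}

results = []
for r in range(1, len(scores) + 1):
    for subset in combinations(scores, r):
        fused = fuse_confidences(scores[s] for s in subset)
        results.append((subset, fused))

# Highest fused confidence first; the full three-sensor set wins here.
for subset, fused in sorted(results, key=lambda x: -x[1]):
    print(f"{'+'.join(subset):>16}: {fused:.2f}")
```

Aggregating such scores across all emplacements, rather than a single target, is what would let Phase II pick the best-performing sensor combination.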
Future research into the technology will enable real-time automatic target detection displayed with an augmented reality engine, according to auganix.org.