
Automating Intelligence

The rapid multiplication of tools for collecting ISR data, the many terabytes of data collected and the importance of that data to both tactical and strategic decisions have put a premium on automating its collection, handling and distribution. The effort to automate intelligence involves military and civilian agencies as well as many private firms.

Today’s distributed common ground system (DCGS), or Sentinel, has limited automation to streamline various functions for data processing, exploitation and dissemination, explained Colonel Michael Shields, chief of the capabilities division at the Air Force ISR Agency. Software applications assist analysts in building a sensor plan and researching historical data. These applications also cue analysts to specific areas of probable interest, build secondary imagery products for dissemination, and move, mark and store data efficiently.

The applications work with the U-2, while the research and secondary-image software tools support geospatial sensors on the U-2, RQ-4 Global Hawk, MQ-1 Predator, MQ-9 Reaper and MC-12 Liberty.

Using approved navigation plans, the sensor-planner tool for geospatial intelligence builds a corresponding sensor-collection plan. It focuses on the approved target deck and on the areas to be imaged at optimal locations along the navigation track, weighing view angles, terrain and distance from the track. The human Air Force DCGS mission/sensor planner can also override the automatic results, moving targets along the navigation track to optimize data collection.
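
The planning logic Shields describes can be pictured as a scoring problem: for each target on the approved deck, find the point along the navigation track from which it can best be imaged, given look angle, terrain masking and stand-off distance. The sketch below is a hypothetical, heavily simplified illustration of that idea, not the actual DCGS planner; all names, the terrain flag and the scoring weights are invented.

```python
import math
from dataclasses import dataclass

@dataclass
class TrackPoint:
    lat: float
    lon: float
    alt_m: float          # aircraft altitude in meters

@dataclass
class Target:
    name: str
    lat: float
    lon: float
    terrain_masked: bool  # hypothetical flag: terrain blocks the line of sight

def ground_range_km(a_lat, a_lon, b_lat, b_lon):
    """Coarse equirectangular distance; good enough for a toy planner."""
    dlat = math.radians(b_lat - a_lat)
    dlon = math.radians(b_lon - a_lon) * math.cos(math.radians((a_lat + b_lat) / 2))
    return 6371.0 * math.hypot(dlat, dlon)

def score(point: TrackPoint, tgt: Target, max_range_km: float = 80.0) -> float:
    """Higher is better: penalize long stand-off ranges and shallow look angles."""
    if tgt.terrain_masked:
        return 0.0
    rng = ground_range_km(point.lat, point.lon, tgt.lat, tgt.lon)
    if rng > max_range_km:
        return 0.0
    depression_deg = math.degrees(math.atan2(point.alt_m / 1000.0, rng))
    return (1.0 - rng / max_range_km) * min(depression_deg / 45.0, 1.0)

def plan(track: list[TrackPoint], deck: list[Target]) -> dict:
    """Pick the best collection point on the approved track for each target."""
    collection_plan = {}
    for tgt in deck:
        best = max(track, key=lambda p: score(p, tgt))
        if score(best, tgt) > 0.0:
            collection_plan[tgt.name] = best
    return collection_plan
```

The human override Shields mentions would correspond to simply replacing an entry in the resulting plan by hand.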

“We are also looking to receive automated mission-planning capabilities through our ongoing cryptologic mission management efforts,” Shields said. “This will aid operators in planning mission routes optimized for signals intelligence [SIGINT] collection.”

The DCGS Workflow application stages reference data or images to analyst workstations based on target number and geo-coordinates. “This reduces the time required by the analyst to pull data from system archives,” Shields said.
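
The staging Shields describes amounts to prefetching: once a target number and its coordinates are known, reference material is copied from the archive to the analyst’s workstation so it is already local when exploitation begins. A minimal sketch, with invented paths and archive layout:

```python
import shutil
from pathlib import Path

ARCHIVE = Path("/archive/reference")        # hypothetical archive location
WORKSTATION = Path("/workstations/ws-07")   # hypothetical analyst workstation

def stage_reference_data(target_number: str, lat: float, lon: float) -> list[Path]:
    """Copy archived reference products for one target to the analyst workstation."""
    staged = []
    # In this toy layout the archive is organized by target number.
    for product in (ARCHIVE / target_number).glob("*"):
        dest = WORKSTATION / target_number / product.name
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(product, dest)
        staged.append(dest)
    # The coordinates could additionally drive a spatial query for nearby imagery;
    # that step is omitted to keep the sketch small.
    return staged
```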

When a multispectral image is collected with a spectral signature, software creates a square-shaped overlay file, places it on the image and signals an analyst to look at that area. For SIGINT, automated tools plug in events of interest that may indicate higher-priority activities worthy of collection. The secondary-image application automatically fills certain fields in the product template using metadata embedded in the image.
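
The last of those steps, auto-populating the product template from metadata carried with the image, can be shown in a few lines. The field names and metadata keys below are invented for illustration; fielded products would follow the applicable imagery standards.

```python
def fill_product_template(image_metadata: dict) -> dict:
    """Populate secondary-imagery product fields from metadata embedded in the image.

    `image_metadata` stands in for whatever the sensor embeds (time, platform,
    sensor, scene coordinates); the key names here are hypothetical.
    """
    return {
        "collection_time": image_metadata.get("acquisition_time"),
        "platform": image_metadata.get("platform_id"),
        "sensor": image_metadata.get("sensor_id"),
        "center_lat": image_metadata.get("center_lat"),
        "center_lon": image_metadata.get("center_lon"),
        "classification": image_metadata.get("classification", "UNMARKED"),
        # Fields the software cannot fill are left for the analyst to complete.
        "analyst_remarks": "",
    }
```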

DCGS has made much progress in automation, “but we believe there is much more to be accomplished,” Shields said. “We continue to look for applications and tools to deal with [the] ever-increasing amount of data we need to analyze.”

Eight automation technology requirements have been documented for the acquisition program office. Several, for planning, tipping and cueing, have been funded in the last four fiscal years. DCGS works with the Defense Advanced Research Projects Agency, Air Force research labs, Mitre, select universities and other industry partners to identify technologies ready for implementation. It also develops technologies at lower technology readiness levels that show promise.

Shields said DCGS seeks tools that increase accuracy and reliability or reduce exploitation time. Some current automation works only in ideal circumstances—for example, rural areas versus heavily trafficked suburbs or sparsely versus densely forested areas. Some tools fail to produce results or produce too many false positives, wasting analysts’ time. And some tools work only with limited target sets.

Moreover, DCGS remains very interested in automated cross-cueing between sensor data and data types, voice-to-text transcription, speaker/language-dialect identification and automated target recognition.

“The nature of our business means we’ve had to automate things from the very inception of the NRO [National Reconnaissance Office] more than 50 years ago, so we have a tremendous amount of experience in doing that and working with industry partners to provide systems that automate things we do,” explained Mike Hale, director of the NRO Ground Enterprise Directorate. “We work diligently to automate those processes that give us the biggest bang for the buck. These automation priorities include mission management, command-and-control and processing of collected data. Those are areas we’ve focused on successfully in the past and we will continue to focus on in the future.”

Hale said the NRO is ensuring that what is automated is moving away from unique, stovepiped systems toward a broad common-infrastructure approach. Such an approach provides the greatest opportunity to pursue effective systems that meet time and cost requirements.

For the future, NRO is closely monitoring the pace of change in commercial markets, as well as in other federal agencies, including the National Security Agency (NSA) and the National Geospatial-Intelligence Agency (NGA). “We believe the pace of change provides us with opportunities for improvement that we can channel into our baseline systems to maintain or increase our effectiveness while becoming more efficient,” Hale said. NRO works most closely with NSA and NGA but also collaborates with the Air Force, Army, Navy and other defense agencies and intelligence community members. “We view the combatant commands chiefly as our end-users,” Hale said. “We regularly talk to them and provide them with data, and NGA and NSA provide them with intelligence products.”

Hale noted that while NRO shares many common concerns with other intelligence agencies, its unique focus on space means it faces distinctive challenges and opportunities. Private firms are critical to the automation effort.

Sentient Vision Systems has two fielded products for airborne ISR, the Kestrel Land and Kestrel Maritime systems. “Both systems provide automatic detection capability for electro-optical and infrared [EO/IR] full-motion video [FMV] systems,” said Chief Technology Officer Tom Loveard, Ph.D. “Kestrel Land provides a moving target indication capability and can detect and track objects that are very difficult for human operators to see,” Loveard explained. “Kestrel Maritime detects objects on the surface of the ocean, whether moving or stationary.” Kestrel Maritime thus applies to ISR operations for the military, border protection and homeland security, as well as search and rescue operations. “We reliably detect targets down to two-by-two pixels in size, but with proven performance down to a half pixel in certain conditions.”

The Kestrel tools primarily assist with detection, where the location or presence of a search target is not known or the target is difficult to locate and track. These situations require coverage of large areas, often over a long period of time. “This can become extremely taxing and difficult for human operators,” Loveard emphasized.

Kestrel’s ability to automatically detect very small objects allows a wide field of view with a large coverage area, while still maintaining a high probability of detection. “Human eyes just don’t scale so well, particularly over long-duration operations,” Loveard said. “Kestrel watches every pixel, hour after hour, and enables operators to concentrate on detected targets, rather than draining their focus with the base search task.”
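
The core idea behind this kind of moving-target detection can be shown with a deliberately naive sketch: difference consecutive, co-registered video frames and flag the pixel regions that changed. This is a toy illustration only, not Sentient’s algorithm; real FMV from a moving platform first requires frame registration, and Kestrel’s detection logic is proprietary. The sketch assumes OpenCV and NumPy are available.

```python
import cv2
import numpy as np

def detect_movers(prev_frame: np.ndarray, frame: np.ndarray,
                  diff_thresh: int = 25, min_area: int = 4) -> list:
    """Return bounding boxes of regions that changed between two co-registered frames."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    diff = cv2.absdiff(gray, prev_gray)
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=1)  # merge fragmented detections

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]  # (x, y, w, h) in pixel coordinates
```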

Another Kestrel function is assisting image exploitation and data fusion. Kestrel exports each detected target with latitude and longitude coordinates. These coordinates can be used for tracking, management, display on mapping systems or cross-cueing between the EO/IR FMV system and other systems such as radar.
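
That export can be thought of as emitting one geo-referenced record per detection for a map display or another sensor to consume. A minimal sketch, assuming the pixel-to-ground geolocation has already happened upstream; the record layout is invented, though GeoJSON itself is a standard interchange format.

```python
import json
from datetime import datetime, timezone

def detection_to_geojson(track_id: int, lat: float, lon: float,
                         confidence: float) -> dict:
    """Package a single detection as a GeoJSON Feature for mapping or cross-cueing."""
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},  # GeoJSON order: lon, lat
        "properties": {
            "track_id": track_id,
            "confidence": confidence,
            "time": datetime.now(timezone.utc).isoformat(),
        },
    }

# Example: hand a detection off to a (hypothetical) radar cross-cue queue or map layer.
feature = detection_to_geojson(track_id=42, lat=34.52, lon=69.17, confidence=0.87)
print(json.dumps(feature, indent=2))
```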

Kestrel systems have been operated on more than 15 airborne platforms, including unmanned aerial systems (UAS) like the RQ-7 Shadow, Predator, RQ-11 Raven and Israel Aerospace Industries’ Heron, and manned aircraft including the P-3 Orion and several helicopters. “The system is designed to scale across the very wide range of air vehicles and sensor systems found across the airborne ISR domain,” Loveard said.

Kestrel has been used in Afghanistan, Iraq and Colombia, as well as on maritime patrol missions off the Horn of Africa, Libya and Australia. Australian forces have used it with Heron, Shadow and ScanEagle. A number of U.S. and coalition partners have also used the system.

Loveard said Kestrel’s moving target indication can detect small and slow-moving objects with low false-detection rates, and he argued this capability is a key differentiator. “We are able to detect the kind of targets that human eyes find the most difficult, such as dismounts. This adds great value, compared with operation without Kestrel.” Doing all this in the real world makes Kestrel distinctive, Loveard said. He added that Kestrel Maritime is “completely unique.”

Furthermore, Sentient has extensive experience with Kestrel. “The time and development required to mature auto-detection capabilities from a lab-based prototype to effective and successful deployment in theater is substantial,” Loveard said. “Only Kestrel has proven in-theater results.” Sentient is working on expanding its capabilities from airborne ISR to surface-based operations on ground vehicles, fixed-mounted sensors and ships.

“You cannot automate analysis; that is judgment—it would be like automating a jury,” summarized Patrick Biltgen, senior mission engineer for BAE Systems Intelligence and Security. “We do data conditioning to help analysis. We help with the mundane tasks to make data ready to be analyzed, make it easier to discover and get on the screen.”

Data conditioning can include putting data in standard format. “Everybody says they already have a standard format but no one does,” Biltgen observed. BAE also helps give temporal and geo-reference coordinates to activities and events that may come from many different sources or from sensors that do not necessarily have geo-references. This is necessary since “people need to see data in time and space to make sense of it,” Biltgen explained.
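
A concrete way to picture data conditioning is normalization into a single event schema that always carries a timestamp and a geo-reference, however the source record arrived. The sketch below is hypothetical; the field names, input shapes and the place-name fallback are invented for illustration.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Event:
    """Common event record: every source is conditioned into this shape."""
    source: str
    time: datetime
    lat: Optional[float]
    lon: Optional[float]
    description: str

def condition(raw: dict, source: str, gazetteer: dict) -> Event:
    """Normalize a raw record; look up coordinates by place name when none are given."""
    lat, lon = raw.get("lat"), raw.get("lon")
    if lat is None and raw.get("place") in gazetteer:
        lat, lon = gazetteer[raw["place"]]            # fallback geo-reference
    return Event(
        source=source,
        time=datetime.fromisoformat(raw["time"]),     # assumes ISO-8601 timestamps
        lat=lat,
        lon=lon,
        description=raw.get("text", ""),
    )

# Two differently shaped inputs conditioned into the same schema.
gazetteer = {"Kabul": (34.52, 69.17)}
e1 = condition({"time": "2013-06-01T10:15:00", "lat": 34.4, "lon": 69.2,
                "text": "vehicle stop"}, "FMV", gazetteer)
e2 = condition({"time": "2013-06-01T10:20:00", "place": "Kabul",
                "text": "intercept report"}, "report", gazetteer)
```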

People tend to equate automation with pattern matching, which usually means integrating several sources of information. Most analysts want to see ISR data on a map, over time and in motion to make sense of it. “We filter data, we triage data,” Biltgen said. “We have tools to make it easy to move through large volumes of data.”

For automation platforms, Biltgen said that some argue relational databases are still the best, while others argue schemas are changing, so relational databases will not work. He stated, “We take a hybrid approach, using each kind of database for what it is best at. You need both to handle big data.”
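
One way to read that hybrid approach: keep rigid, queryable records in a relational store and keep loosely structured or fast-changing records as documents, routing each to whichever store suits it. The sketch below uses SQLite for the relational side and a plain JSON-blob table as a stand-in for a document store; it illustrates the routing idea only and is not BAE’s architecture.

```python
import json
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tracks (id INTEGER PRIMARY KEY, lat REAL, lon REAL, t TEXT)")
db.execute("CREATE TABLE documents (id INTEGER PRIMARY KEY, body TEXT)")  # document-store stand-in

def store(record: dict) -> None:
    """Route fixed-schema track points to the relational table, everything else to documents."""
    if {"lat", "lon", "t"} <= record.keys():
        db.execute("INSERT INTO tracks (lat, lon, t) VALUES (?, ?, ?)",
                   (record["lat"], record["lon"], record["t"]))
    else:
        db.execute("INSERT INTO documents (body) VALUES (?)", (json.dumps(record),))
    db.commit()

store({"lat": 34.5, "lon": 69.1, "t": "2013-06-01T10:15:00"})       # relational side
store({"report": "field note", "tags": ["sigint", "priority"]})     # document side
```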

In the 1990s and early in the new millennium, ISR tools emphasized bringing all data to user desktops as PCs grew more powerful. “Now we want to leverage the technology of web browsers, keep some data on servers and integrate it on servers, rather than bring it all to users,” Biltgen said. “Why bog them down? They only need a small part on the desktop.” Moreover, retaining data on servers lowers storage and communication costs and makes it easier to upgrade software.
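
The shift Biltgen describes, from shipping whole datasets to desktops toward serving only the slice a browser asks for, can be sketched as a small query endpoint. A minimal example using Flask (an assumed choice; any web framework would do) with an invented in-memory dataset:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for a large server-side archive the user never downloads in full.
EVENTS = [
    {"id": 1, "lat": 34.5, "lon": 69.1, "kind": "vehicle"},
    {"id": 2, "lat": 31.5, "lon": 65.8, "kind": "dismount"},
]

@app.route("/events")
def events():
    """Return only the events inside the bounding box the browser requested."""
    south, north = float(request.args["south"]), float(request.args["north"])
    west, east = float(request.args["west"]), float(request.args["east"])
    subset = [e for e in EVENTS
              if south <= e["lat"] <= north and west <= e["lon"] <= east]
    return jsonify(subset)

# e.g. GET /events?south=30&north=35&west=60&east=70 returns both sample events.
```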

The BAE engineer said ISR data analysts will increasingly require statistical and mathematical skills, rather than just cultural skills. “Even if you don’t like math, you will need it.”

The big challenge remains, “How do we really visually understand very large amounts of data?” Biltgen said. “We have so much data it is hard for humans to process it. There is no easy answer or silver bullet.”

And even when analysts have sorted through the immensity of ISR data, “How do you present it to decision makers?” Biltgen asked. “You don’t have hours to present 6,000 PowerPoint slides. You must boil it down. How do you do that convincingly?”

Red Hen Systems provides hardware and software to collect geo-referenced video and photo data in the field and bring this data into desktops and web-based maps to aid decision makers.

Sales executive Hoot Gibson said Red Hen’s easy-to-use digital camera accessories and GPS video-digital recorders collect the video images, along with location information. “Our software lets you process the images and generate multimedia maps that bring vital information to the eyes and fingertips of decision makers,” Gibson explained. “With the click of a button you know where something is and what it looks like. And you can share maps with others over the Internet.”

Specific equipment includes the Blue2CAN, used with a compatible Bluetooth GPS receiver to record the geo-location of every picture taken with a Nikon camera; the high-definition Video Mapping System; the MediaMapper Server, which stores and distributes geo-tagged photos and geo-path videos; multimedia mapping for aerial patrols; and RouteScout, which extends the functions of FalconView mission planning.
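
The end product Gibson describes, a clickable map showing where a picture was taken and what it shows, can be approximated by writing each geo-tagged photo out as a KML placemark, a format that viewers such as Google Earth and FalconView can display. The sketch below is not Red Hen’s software; the file name and coordinates are invented.

```python
def photo_placemark(name: str, lat: float, lon: float, image_path: str) -> str:
    """Build a KML Placemark whose balloon displays the geo-tagged photo."""
    return (f"  <Placemark>\n"
            f"    <name>{name}</name>\n"
            f"    <description><![CDATA[<img src=\"{image_path}\" width=\"400\"/>]]></description>\n"
            f"    <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
            f"  </Placemark>")

placemarks = [photo_placemark("Checkpoint photo", 34.52, 69.17, "IMG_0042.jpg")]
kml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
       '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
       + "\n".join(placemarks) + "\n</Document>\n</kml>")

with open("patrol_photos.kml", "w") as f:
    f.write(kml)
```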

Gibson argued that Red Hen sets the standard for geo-referencing digital and video images from the field. From this, both analysts and decision makers can easily build interactive maps using digital photographs, video and audio. The tools extend from decision-makers and planners to forward warfighters and field agents. Red Hen thus “creates a more detailed and integrated common operational picture, visualizing the world as you need to see it.” ♦

 
