Processing the Flood of Data

The job of the intelligence analyst has become tougher over the last few years. Analysts are bombarded by a growing deluge of data from an exploding constellation of sensors, and they are often trying to anticipate the unknown actions of an unknown adversary from a series of observations that are not necessarily neatly organized by time and place.

Intelligence analysts operate in a fast-moving and ever-shifting landscape in which it is becoming increasingly difficult to proactively uncover challenges and opportunities that are coming down the pike. That is why the effort to automate many intelligence functions has come to the fore in recent years. Allowing machines to correlate, analyze and fuse raw data creates time for analysts to do what machines cannot: Apply human judgment and experience to intelligence problems.

Automating intelligence is another version of the classic big data problem. Technology advancements have enabled the processing of ever-larger data sets. Technology can narrow down the scenarios and options from which analysts, commanders and warfighters must choose. At the same time, the robust processing power that is brought to bear allows the consideration of a broader universe of data to begin with.

“More and more sensors [have] been put to use in Afghanistan, which has generated more and more data at the forward edges of the battlefield,” said Colonel Charles Wells, project manager for the Distributed Common Ground System-Army (DCGS-A). “Dealing with massive amounts of data requires automated tools to make sense of that data.” DCGS-A is one of a family of programs with common elements, being developed and deployed separately by each of the armed services. DCGS is designed to provide an interoperable architecture for the collection, processing, exploitation, dissemination and archiving of all forms of intelligence.

“New sources of intelligence, such as small, unit-controlled UAVs and other types of tactically emplaced smart sensors, provide tactical units with information they have never had access to until now,” said Jeff DeTroye, vice president for special programs at Analytical Graphics Inc.

At the same time that sensors have proliferated and the volume of data has exploded, the analyst workforce has remained stagnant. “Increases in the analytical workforce have not matched sensor output,” said John Beck, business development manager at Lockheed Martin Information Systems & Global Solutions. “This has led to the deployment of some form of automation with varying degrees of success.” Lockheed Martin is a key DoD contractor for the DCGS Information Backbone, or DIB. The DIB ensures data discovery and interoperability across the family of DCGS systems.

“We are drowning in data and information derived from a plethora of sources, means and methods, from the traditional forms of intelligence to publicly available arenas such as social media. Combined, they are rich in complex content, including narrative, images and metadata. In order for rapid sense making and analysis to take place, we need tools that reduce the cognitive burden on users and that quickly sift through the noise to find, correlate and auto-associate in ways that reduce search and processing time, streamline discovery, and ultimately give the analyst time to do more analysis,” said Ken Campbell, vice president for national security solutions at DigitalGlobe.

“Automated intelligence tools come to answer the age-old question ‘What or who is on the other side of the hill?’ without exposing your unit,” said DeTroye. “These new tactically controlled sources of intelligence provide the opportunity to keep targets under persistent surveillance in the hours or days before a strike to ensure the high-value target is present or to work out the pattern of life.”

The introduction of automation has changed how intelligence processes work. “In the past, intelligence processes were labor-intensive, manual efforts to sift through the sea of data,” said Campbell.

“The role of human judgment has been eliminated from some of the very manual, tedious and laborious processes,” added Wells. “This allows analysts to focus on intelligence problems and challenges that are the most vexing,” noted Jon Armstrong, director for business development at BAE Systems’ intelligence and security sector. “Analysts can make use of pattern-of-life modeling and figure out where to focus their efforts.”

New intelligence processes have significantly changed how missions are organized and approached. What was once a more formal and linear process has given way to a more circular, dynamic and fruitful one.

“In the past, analysts were constrained to collect, process, exploit [and] disseminate data and intelligence in that order,” said Armstrong. “Now they can make decisions earlier in the process. They can change the tasking of data collection based on what has already been discovered in the exploitation space.”

The Army’s requirements for automated intelligence tools revolve around providing meaningful answers to commanders in the field. “We need a powerful system that can connect to all forms of data and to other analysts,” said Wells. “We also need a system that is easy to use. We want analysts to get on DCGS and become proficient very quickly.”

DCGS is in the process of bringing this to fruition by developing a common set of hardware and software and a standards-based architecture. “The standards allow developers of applications to easily integrate into the DCGS environment,” said Beck. “That is how DCGS is able to work with dozens of industry partners and draw from their efforts.”

The same approach will allow DCGS to adapt to future modes of warfare. “We have been focused in recent years on counterinsurgency in Afghanistan,” said Wells. “We now realize we also have to support mid- to high-intensity conflict that may come about in the future. In Afghanistan the biggest challenge was to roll up IED networks. In the Pacific, the challenge may be the ability to detect and analyze electronic intelligence. The architecture we have developed will allow us to incorporate capabilities in these new areas as required.”

Besides the promulgation of standards, several technology developments have also facilitated the development of intelligence automation tools. “Tactically rugged processors and low-power communications allow forces in the field to have automated capabilities at their fingertips; that was undreamed of just 10 years ago,” said DeTroye.

High performance computing solutions that can provision and process massive data sets have also been a major facilitator. “This includes cloud-based computing in which data is stored and processed across nodes or clusters of virtualized machines, or in which databases, data processing and applications are combined in memory,” said Campbell. “These solutions allow us to bring all the data and tools into one computing ecosystem accessible via desktop and mobile computing devices.”

“We are seeing acceleration on the infrastructure side, on the computing and storage side of the business,” said Matt Fahle, a senior executive for intelligence services at Accenture. “As cloud computing capabilities mature within the DoD and intelligence community, there will also be growth on the application side and mobile applications will begin to emerge as part of that evolution.” DCGS workstations have also evolved to the point where they are powerful enough to handle many forms of intelligence. In the past, specialized workstations were utilized to separately handle video, human intelligence and signals intelligence.

“The analyst can now see all the pieces of the puzzle to get compelling answers for commanders,” said Wells. “DCGS workstations are all connected, which facilitates collaboration among analysts. In the future, we foresee including that same type of functionality on mobile devices.”

On the back end, technology advancements allow analysts to gain insights based on larger universes of data. “In the past, analysts had to build complex queries against limited data sets,” said Wells. “The answers inevitably missed much of the puzzle by virtue of the limitations of what analysts could search through. Now back-end systems can load all the data quickly and search through it all using powerful algorithms to get meaningful answers. They can search through every intelligence report for several years over wide geographical areas and still come back with precise answers.”
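
The kind of wide-area, multi-year search Wells describes can be pictured as a spatiotemporal filter over a store of reports. The following is a minimal sketch in Python using an invented Report record; it illustrates the shape of such a query, not the DCGS-A back end.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical, simplified intelligence-report record for illustration only.
@dataclass
class Report:
    report_id: str
    lat: float
    lon: float
    timestamp: datetime
    text: str

def query_reports(reports, bbox, start, end, keyword):
    """Return reports inside a lat/lon bounding box and a time window that
    mention a keyword -- a toy stand-in for the back-end search Wells describes."""
    min_lat, min_lon, max_lat, max_lon = bbox
    return [
        r for r in reports
        if min_lat <= r.lat <= max_lat
        and min_lon <= r.lon <= max_lon
        and start <= r.timestamp <= end
        and keyword.lower() in r.text.lower()
    ]

# Example: every report mentioning "checkpoint" in a region over two years.
reports = [
    Report("r1", 34.5, 69.2, datetime(2013, 6, 1), "Vehicle stopped at checkpoint"),
    Report("r2", 31.6, 65.7, datetime(2012, 3, 9), "Market activity normal"),
]
hits = query_reports(reports, (30.0, 60.0, 38.0, 75.0),
                     datetime(2012, 1, 1), datetime(2014, 1, 1), "checkpoint")
print(len(hits), "matching reports")
```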

BAE Systems has developed tools that give analysts exposure to all types of data from which to build hypotheses, and to continue revising those hypotheses based on new data as well as input from their colleagues. “The analysts can build watch boxes through which they are alerted when new data arrives that is relevant to the hypotheses they are working on,” said Armstrong.
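
Armstrong’s watch boxes can be thought of as standing queries against incoming data. The sketch below assumes a hypothetical detection record and alert callback; it shows the pattern rather than BAE Systems’ actual tooling.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Callable

# Hypothetical incoming record: a detection with position, time and type.
@dataclass
class Detection:
    lat: float
    lon: float
    time: datetime
    kind: str   # e.g. "vehicle", "radio_emission"

@dataclass
class WatchBox:
    """A standing query: alert the analyst when relevant data lands in an area."""
    name: str
    min_lat: float
    min_lon: float
    max_lat: float
    max_lon: float
    kinds: set
    on_alert: Callable[[str, Detection], None]

    def check(self, det: Detection) -> None:
        in_box = (self.min_lat <= det.lat <= self.max_lat
                  and self.min_lon <= det.lon <= self.max_lon)
        if in_box and det.kind in self.kinds:
            self.on_alert(self.name, det)

# Usage: alert whenever a vehicle detection arrives inside the box.
box = WatchBox("crossing-point", 34.0, 69.0, 34.2, 69.3, {"vehicle"},
               lambda name, d: print(f"[{name}] new {d.kind} at {d.lat}, {d.lon}"))
box.check(Detection(34.1, 69.1, datetime.now(), "vehicle"))
```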

The first step to making this happen is to “democratize the data,” in Armstrong’s words, which means normalizing data and curating metadata so that a wide variety of information collected by and stored in divergent systems can be understood. “The data has to be correlated to provide the analyst with perspective,” said Armstrong. “This often involves overlaying the data geospatially and temporally. Doing the initial correlation in metadata space is helpful in quickly digesting large volumes of data and producing a manageable data set. Analyzing the metadata can develop an understanding of patterns of life and how adversaries and other relevant actors operate.”
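
One way to picture that first step is a normalization pass that maps records from divergent systems into a common metadata schema, followed by coarse geospatial and temporal binning so that co-occurring observations surface quickly. The sketch below is notional; the source record formats and field names are invented, and the binning scheme is only an assumption about how such a correlation might start.

```python
from collections import defaultdict
from datetime import datetime

# Invented source formats standing in for divergent collection systems.
sigint_rec = {"freq_mhz": 412.5, "geo": (34.12, 69.05), "t": "2014-02-03T10:15:00"}
humint_rec = {"location_lat": 34.13, "location_lon": 69.04,
              "reported_at": "2014-02-03T10:40:00", "summary": "meeting observed"}

def normalize_sigint(r):
    return {"source": "sigint", "lat": r["geo"][0], "lon": r["geo"][1],
            "time": datetime.fromisoformat(r["t"])}

def normalize_humint(r):
    return {"source": "humint", "lat": r["location_lat"], "lon": r["location_lon"],
            "time": datetime.fromisoformat(r["reported_at"])}

def geo_time_bin(rec, cell_deg=0.1):
    """Coarse spatial/temporal key for the initial metadata-space correlation:
    records in the same grid cell during the same hour share a bin."""
    return (int(rec["lat"] / cell_deg), int(rec["lon"] / cell_deg),
            rec["time"].replace(minute=0, second=0, microsecond=0))

records = [normalize_sigint(sigint_rec), normalize_humint(humint_rec)]
bins = defaultdict(list)
for rec in records:
    bins[geo_time_bin(rec)].append(rec)

# Bins holding records from more than one source are candidates for analyst review.
for key, recs in bins.items():
    if len({r["source"] for r in recs}) > 1:
        print("correlated activity near", key, "from", [r["source"] for r in recs])
```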

BAE takes the approach of integrating data before exploiting it. “Analysts are often not sure what the question is,” said Armstrong. “Sometimes it is best to let the data figure out the question and sometimes the answer arrives before the question is known.” BAE Systems has delivered automated intelligence solutions across the DoD and intelligence community for over three years.

AGI has developed software that can automate many intelligence tasks, particularly in the mission planning area. “These mission planning tools can ensure that a UAV flies a profile that will allow access to the target from the right angles, can ensure line-of-sight through challenging terrain and allow the UAV to be as inconspicuous as possible,” said DeTroye. “Our software can also determine the correct placement of ground sensors to overwatch specific areas. The data from all of the available sensors, both tactical and national systems, can then be fused using AGI visualization techniques to provide the complete picture.” AGI software is in use in many locations in the intelligence community and in most of the combatant commands.
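
A core piece of the mission-planning problem DeTroye describes is checking line-of-sight between a sensor and a target over intervening terrain. Below is a simplified, notional version of that calculation over a one-dimensional elevation profile; AGI’s software models terrain, sensor geometry and platform dynamics in far greater fidelity.

```python
def has_line_of_sight(profile, sensor_height_m=3.0, target_height_m=0.0):
    """Check whether a sensor at the start of an elevation profile can see a
    target at the end, by testing whether intervening terrain rises above the
    straight sight line. `profile` is a list of ground elevations (meters)
    sampled at equal spacing from sensor to target."""
    n = len(profile)
    if n < 2:
        return True
    start = profile[0] + sensor_height_m
    end = profile[-1] + target_height_m
    for i in range(1, n - 1):
        # Elevation of the sight line at sample i (linear interpolation).
        sight = start + (end - start) * i / (n - 1)
        if profile[i] > sight:
            return False   # terrain blocks the view
    return True

# Example: a ridge midway along the profile masks the target.
print(has_line_of_sight([100, 120, 180, 110, 100]))        # False: 180 m ridge blocks
print(has_line_of_sight([100, 101, 100.5, 100.2, 100]))    # True: clear view
```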

For the past several years, DigitalGlobe has been developing a geospatial toolkit and application development framework for performing analytics across large geographical regions. These systems can use multi-terabyte data sets that exceed the capabilities of desktop geographic information system (GIS) platforms.

“We call it MrGeo, for MapReduce Geo,” said Campbell. “It is designed to leverage both CPU- and GPU-based processing for performing operations that can be distributed across a Hadoop/MapReduce framework.”

Graphics processing units, or GPUs, can handle much higher volumes of parallel processing than the traditional central processing unit (CPU) that powers the typical desktop or server. Hadoop and MapReduce are systems often used to manage big data computing. Hadoop provides a distributed file system that splits up and stores large files across many commodity servers, while MapReduce is a programming model that breaks a job into many small tasks that run in parallel.
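
To make the division of labor concrete, the snippet below mimics the map and reduce stages in plain Python for a toy geospatial task: counting above-threshold elevation cells per terrain tile. Real MrGeo jobs run across a Hadoop cluster rather than in a single process, so this only shows the shape of the computation.

```python
from collections import defaultdict

# Toy "tiles": in a real system each tile would be a block of raster data in HDFS.
tiles = {
    "tile_a": [1200, 1350, 900, 1600],
    "tile_b": [800, 760, 1500, 1420],
}

def map_phase(tile_id, elevations, threshold=1000):
    """Map step: each tile is processed independently (in parallel on a cluster),
    emitting (key, value) pairs."""
    above = sum(1 for e in elevations if e > threshold)
    return [("above_threshold", above), ("total_cells", len(elevations))]

def reduce_phase(pairs):
    """Reduce step: values sharing a key are merged into a single result."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

emitted = []
for tile_id, elevations in tiles.items():
    emitted.extend(map_phase(tile_id, elevations))
print(reduce_phase(emitted))   # {'above_threshold': 5, 'total_cells': 8}
```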

DigitalGlobe’s tool also incorporates an easy-to-use interface that gives the analyst access to pre-processed and provisioned data through applications and widgets designed to support a variety of geospatial operations, such as route mobility studies, radio frequency propagation analysis and site selection. “All of these operations can be executed in a near-simultaneous workflow on a country or continental scale,” said Campbell.

A unit that needs to identify the optimal route for an off-road vehicle across a mountain region in Afghanistan could spend hours or days solving the problem with desktop GIS software, according to Campbell. The same scenario in MrGeo becomes greatly simplified. Currently, instances of MrGeo are undergoing development and integration with three DoD customers.
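
Conceptually, the route-mobility problem reduces to a shortest-path search over a traversal-cost grid derived from terrain. The sketch below runs Dijkstra’s algorithm on a tiny hand-made grid; it illustrates the kind of computation being distributed, not any actual MrGeo API.

```python
import heapq

# Notional traversal-cost grid (higher = harder to cross, e.g. steep slope).
cost = [
    [1, 1, 5, 9],
    [2, 8, 5, 9],
    [1, 1, 1, 2],
    [9, 9, 1, 1],
]

def cheapest_route(cost, start, goal):
    """Dijkstra's algorithm over a 4-connected grid: returns the minimum
    cumulative traversal cost from start to goal."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    heap = [(dist[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return None

print(cheapest_route(cost, (0, 0), (3, 3)))   # cheapest cumulative cost across the grid
```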

Lockheed Martin, besides being a key player in DCGS infrastructure, provides support for multi-intelligence fusion for both the Army and Air Force DCGS programs. “We do multi-int fusion to produce more precise geolocation information,” said Beck. “We have deployed fusion engines to the Army for 20 years.”

Data fusion provides a more precise understanding of what is being viewed in the field. “The fusion process may conclude that there are at least two, or perhaps three entities being looked at,” said Beck. “Human intervention makes the final determination but data fusion improves the quality and speed of the development of intelligence.” Lockheed has also been working on a program called Wisdom, which seeks to extract information from unstructured data.
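
A simple way to see how fusion tightens a geolocation estimate is inverse-variance weighting: each source contributes a position and an uncertainty, and the more confident sources carry more weight in the combined fix. This is a textbook illustration, not Lockheed Martin’s fusion engine.

```python
def fuse_positions(estimates):
    """Inverse-variance weighted fusion of independent position estimates.
    `estimates` is a list of (lat, lon, sigma), where sigma is the 1-sigma
    error in degrees. Returns (fused_lat, fused_lon, fused_sigma)."""
    weights = [1.0 / (sigma ** 2) for _, _, sigma in estimates]
    total = sum(weights)
    lat = sum(w * e[0] for w, e in zip(weights, estimates)) / total
    lon = sum(w * e[1] for w, e in zip(weights, estimates)) / total
    fused_sigma = (1.0 / total) ** 0.5
    return lat, lon, fused_sigma

# Example: a coarse SIGINT fix and a tight imagery fix on the same entity.
sigint_fix = (34.1200, 69.0500, 0.020)
imagery_fix = (34.1150, 69.0460, 0.005)
print(fuse_positions([sigint_fix, imagery_fix]))
```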

DCGS-A’s latest version, called Hunte, is now deployed throughout Afghanistan. “We listened to soldiers when we built this latest version,” said Wells. “The soldiers mapped their workflows for us and we changed the software to match that. Many companies, small and large, worked with the program and integrated their off-the-shelf capabilities to make the program more powerful.”

Future intelligence automation capabilities will emphasize mobility, according to Fahle. “I see many mobile apps coming online in the next few years,” he said. “We are currently at the tipping point where mobile devices and tablet computers are becoming capable of handling large blocks of data.”

DigitalGlobe is working on an activity-based intelligence solution called the Movement Intelligence Distributed Analytic Services, or MIDAS. MIDAS enables correlation of data derived from full-motion video, ground moving target indication, and wide-area motion imagery against high-resolution imagery and 3-D models for discovering networks associated with events and locations in an area of interest.

“MIDAS is principally designed to address the data-intensive world of unmanned aerial system collections and give those challenged with the responsibility of processing, exploiting and disseminating information from aerial platforms the ability to do so in an accelerated and efficient fashion,” said Campbell.

“In the next 10 years we will see more of the processing of sensor data [that] is currently occurring at dedicated sites, transitioning to the location of the sensor,” said John McFassel, acting chief systems engineer at Program Executive Office Intelligence, Electronic Warfare, and Sensors. “Processing capability will be collocated with the sensor on ground and air platforms. This will be enabled by reductions in the size, weight and power requirements of both the sensors and the consuming applications.”

Processing sensor data forward will have several advantages, according to McFassel. “The main one is that it will reduce the time required to convert raw data into information to aid situational awareness,” he explained. “The current lag time for transmitting sensor output to a distant location for processing and exploitation will be significantly reduced. A secondary effect will be that there will be less demand on the transport layer for bandwidth to move this data. The sheer volume of data being transmitted from advanced sensors is significantly taxing communication systems, which are also challenged by mission command requirements.”
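
McFassel’s bandwidth point can be made concrete with rough, back-of-the-envelope arithmetic: shipping detection reports instead of raw imagery cuts the transport load by orders of magnitude. The figures below are illustrative assumptions, not measured values from any program.

```python
# Rough, notional arithmetic on why processing at the sensor saves bandwidth.
raw_frame_bytes = 1920 * 1080 * 3          # one uncompressed color video frame
frames_per_second = 30
raw_rate_mbps = raw_frame_bytes * frames_per_second * 8 / 1e6

detection_bytes = 200                      # a detection report: position, time, class
detections_per_second = 5
processed_rate_kbps = detection_bytes * detections_per_second * 8 / 1e3

print(f"raw video: ~{raw_rate_mbps:.0f} Mb/s, detections only: ~{processed_rate_kbps:.0f} kb/s")
```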

Another emerging intelligence initiative is the networking of multiple sensors. “Platforms [that] had hosted a single type of sensor will now host a suite of sensors of multiple modalities,” said McFassel. “This will facilitate tipping and cueing between the multiple sensors on a vehicle or aircraft as well as between aerial and terrestrial platforms and fixed sites. Automation will be important to take full advantage of this capability in a timely manner.”
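
Tipping and cueing can be sketched as a simple event flow: a confident detection from a wide-area sensor “tips” the system, which then “cues” a narrower, higher-resolution sensor to the same location. The sensor names and threshold below are invented for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str
    lat: float
    lon: float
    confidence: float

@dataclass
class CueOrder:
    target_sensor: str
    lat: float
    lon: float
    reason: str

def tip_and_cue(detection, cue_threshold=0.6, narrow_sensor="fmv_turret"):
    """If a wide-area detection is confident enough, generate a cue tasking a
    higher-resolution sensor to stare at the detection location."""
    if detection.confidence >= cue_threshold:
        return CueOrder(narrow_sensor, detection.lat, detection.lon,
                        f"tip from {detection.sensor}")
    return None

# A wide-area motion-imagery detection cues the full-motion-video sensor.
tip = Detection("wami_wide", 34.21, 69.11, 0.8)
print(tip_and_cue(tip))
```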

DCGS-A is moving toward a cloud computing infrastructure with a pilot program called Intelligence Community Information Technology Enterprise, or IC ITE. “IC ITE has great potential to take some cutting-edge technologies and bring them into the Army,” said Wells. “As analysts in the intelligence community create products, we can bring those down to the Army and vice versa.”

“The promise of the cloud,” said Beck, “is to be able to leverage the power of the enterprise to the edge. Analysts and commanders will get results informed by the power of the entire enterprise. They won’t have to haul a lot of gear with them and they can be working from a hole in the ground. The improvements that will be brought about when cloud computing is applied to intelligence automation will be very noticeable.” ♦

 
