could be added to the autonomous
system to expand its capabilities.
“This is a really good success
story,” Linne von Berg said. The
technology is ready now, he added: “We’re hoping that it does get picked up and used.”
The Navy also is developing an
automated system for finding and
tracking “high-value individuals,”
or HVIs, called the Semantic Targeting and All-source Fusion, or STAFF.
The system is designed to comb
through huge amounts of data gathered by many types of sensors and human intelligence agents, and to find and compile information about specific individuals being sought.
Ideally, “STAFF tells me everything about the HVI
— who he made his last cell phone call to, who his rel-
atives are, where he lives, where he banks,” Hagan
said. “STAFF does that automatically.”
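The kind of multi-source profile Hagan describes can be pictured as a simple fusion step: group every attribute reported about one subject, from any source, into a single record. The sketch below is purely illustrative; the data model, field names, and sample sources are invented for this example and are not STAFF’s actual design.

```python
from collections import defaultdict

# Illustrative only: each "report" is a (source, subject, attribute, value)
# tuple; fusion groups every attribute observed about one subject into a
# single profile, keeping track of which source supplied it.
def fuse_reports(reports):
    profiles = defaultdict(lambda: defaultdict(set))
    for source, subject, attribute, value in reports:
        profiles[subject][attribute].add((value, source))
    return profiles

# Invented sample data standing in for sensor and human-intelligence feeds.
reports = [
    ("cell-intercepts", "subject-1", "last_call_to", "+000-555-0101"),
    ("humint", "subject-1", "relative", "brother of subject-2"),
    ("financial", "subject-1", "bank", "Example Bank, branch 12"),
]
profile = fuse_reports(reports)["subject-1"]
# profile now holds three attributes, each tagged with its source
```

The point of the sketch is that fusion is mostly bookkeeping; the hard part, which the article attributes to semantic technology, is deciding that two raw reports refer to the same subject in the first place.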
Even though it’s automated, it could take “months or
longer” for the system to gather and analyze enough data
to compile “an HVI target package” for its human oper-
ators, Hagan said. Still, the automated system can do
what humans simply cannot, such as searching through
“all of the cell phone traffic within a city,” he said.
STAFF also can conduct automated searches through
huge volumes of full-motion video, moving target indicator data, human intelligence reports and other data. The
system uses semantic technology and natural language
processing to analyze intelligence data, Hagan said.
Semantic technology enables computers to think more
like humans. Rather than simply digging through data to
find specific words or the image of a particular vehicle,
semantic technology can recognize the meaning of words
and the significance of images in a defined context. In
that way, it can sift through unstructured data “and pull
out essential elements of information,” Hagan said.
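The contrast between literal keyword search and the semantic matching Hagan describes can be shown with a toy concept map. Real semantic and natural-language systems use statistical language models rather than hand-built tables; the concepts and report text below are invented for illustration.

```python
# A crude "semantic" layer: map a concept to the words that can express it.
# Invented for illustration; real systems learn these relationships.
CONCEPTS = {
    "vehicle": {"vehicle", "truck", "car", "convoy"},
    "meeting": {"meeting", "gathering", "rendezvous"},
}

def keyword_match(text, term):
    # Literal search: the exact word must appear.
    return term in text.lower().split()

def semantic_match(text, concept):
    # Concept search: any word expressing the concept counts as a hit.
    words = set(text.lower().split())
    return bool(words & CONCEPTS.get(concept, {concept}))

report = "three trucks formed a convoy near the river crossing"
keyword_match(report, "vehicle")   # False: the literal word is absent
semantic_match(report, "vehicle")  # True: "convoy" expresses the concept
```

However shallow, the example captures the idea in the passage: meaning in context, not exact wording, drives the match.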
Similarly, natural language processing understands
the context in which things are said or written.
“With today’s systems, you can search for information
in just a few ways,” Hagan said. For instance, analysts
can search for all of the data gathered at a certain location
or at a certain time, or they can search by keywords.
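Those conventional search modes, by location, time window, or keyword, amount to simple record filters. A minimal sketch, with invented record fields:

```python
from datetime import datetime

# Hypothetical intelligence records; the fields are invented for this sketch.
records = [
    {"time": datetime(2011, 8, 3, 14, 0), "location": "grid-A7",
     "text": "white pickup observed heading north"},
    {"time": datetime(2011, 8, 4, 9, 30), "location": "grid-B2",
     "text": "market crowded, no activity of note"},
]

def search(records, location=None, start=None, end=None, keyword=None):
    # Keep only records passing every filter the analyst supplied.
    out = []
    for r in records:
        if location and r["location"] != location:
            continue
        if start and r["time"] < start:
            continue
        if end and r["time"] > end:
            continue
        if keyword and keyword not in r["text"]:
            continue
        out.append(r)
    return out

hits = search(records, location="grid-A7", keyword="pickup")
```

Each filter only narrows on literal field values, which is exactly the limitation the semantic approach described above is meant to overcome.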
[Photo caption: An MQ-8B Fire Scout unmanned air vehicle is craned off the flight deck of the guided-missile frigate USS Halyburton Aug. 3 at Naval Station Mayport, Fla., as another waits its turn. Fire Scouts flying off Halyburton helped allied forces pick targets for NATO bombers in Libya and searched for pirates off Somalia during a seven-month deployment to the U.S. Fifth and Sixth Fleet areas of responsibility.]
A camera system developed by the Naval Research Laboratory (NRL) can track and identify multiple moving targets on the ground, all without human intervention, said Dale Linne von Berg, who heads NRL’s applied optics branch. The goal is to reduce the growing workload on intelligence analysts, he said.
The system uses two airborne cameras managed by automated software that can recognize and react to moving objects on the ground. One camera, a wide-area
infrared surveillance sensor, scans the ground for moving
targets. When it spots them, an autonomous cue manager
directs the second camera, a narrow-field-of-view sensor,
to take detailed pictures for target identification.
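The cue-and-confirm loop described above can be sketched as a small event-driven manager: wide-area detections go into a queue, and each step tasks the narrow-field camera at the next cue. The class and method names are invented for illustration and do not reflect NRL’s implementation.

```python
# Illustrative sketch of an autonomous cue manager, not NRL's design.
class NarrowCamera:
    def image(self, detection):
        # Stand-in for slewing the narrow-field sensor and capturing a chip.
        return {"target": detection, "frame": "high-res chip"}

class CueManager:
    def __init__(self, narrow_camera):
        self.narrow_camera = narrow_camera
        self.queue = []

    def on_wide_area_detection(self, detection):
        # Wide-area IR sensor reports a mover; queue it for a close look.
        self.queue.append(detection)

    def step(self):
        # Task the narrow-field camera at the oldest unserviced cue.
        if self.queue:
            return self.narrow_camera.image(self.queue.pop(0))
        return None

mgr = CueManager(NarrowCamera())
mgr.on_wide_area_detection({"id": 1, "pos": (34.1, 44.2)})
result = mgr.step()   # detailed image of cue 1 for identification
```

The design point is the division of labor: the wide-area sensor stays cheap per pixel and covers area, while the narrow sensor spends its resolution only where a cue exists.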
Depending on the altitude of the aircraft carrying the
cameras, the system can spot and identify moving vehicles
and even individuals on the ground, Linne von Berg said.
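The altitude dependence follows from ground sample distance (GSD), the patch of ground one detector pixel covers: GSD = altitude × pixel pitch / focal length. The sensor numbers below are invented for illustration and are not the NRL system’s specifications.

```python
# GSD relates altitude to what a camera can resolve on the ground.
def ground_sample_distance(altitude_m, pixel_pitch_m, focal_length_m):
    return altitude_m * pixel_pitch_m / focal_length_m

# A hypothetical 15-micron detector behind a 0.5 m focal-length lens:
gsd_low = ground_sample_distance(3000.0, 15e-6, 0.5)    # 0.09 m per pixel
gsd_high = ground_sample_distance(10000.0, 15e-6, 0.5)  # 0.30 m per pixel
# At 0.09 m/pixel a person spans several pixels and may be identifiable;
# at 0.30 m/pixel a person covers roughly one pixel, so only vehicles are.
```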
“From a technology point of view, it’s revolutionary,” he said.
The camera system substantially reduces the number of people and the amount of time it takes to identify targets. In fact, Linne von Berg said, the system is
so smart “it could work all by itself.”
It is capable of spotting and identifying targets, then
issuing orders to strike, all without human intervention
— “assuming that, legally, we could do that,” he said.
For now, that is not planned.
Demonstrations last March focused on land targets,
but the system is equally capable at sea or under the sea,
he said. And synthetic aperture radars, hyperspectral
sensors, signals intelligence collectors and other sensors