DARPA and the US Army have taken the wraps off ARGUS-IS, a 1.8-gigapixel video surveillance platform that can resolve details as small as six inches from an altitude of 20,000 feet (6 km). ARGUS is by far the highest-resolution surveillance platform in the world, and probably the highest-resolution camera in the world, period.
ARGUS, which would be attached to some kind of UAV (such as the Predator) and flown at an altitude of around 20,000 feet, can observe an area of 25 square kilometers (10 sq mi) at any one time. If ARGUS were hovering over New York City, it could observe roughly half of Manhattan. With two ARGUS-equipped drones, the US could keep an eye on the entirety of Manhattan, 24/7.
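Those coverage figures hold up to a quick back-of-the-envelope check (note that Manhattan’s roughly 59 km² of land area is our own assumption, not a number from DARPA):

```python
# Sanity-checking ARGUS's quoted altitude and coverage figures.

FT_PER_KM = 3280.84        # feet per kilometer
SQKM_PER_SQMI = 2.58999    # square kilometers per square mile
MANHATTAN_SQKM = 59        # approximate land area of Manhattan (our assumption)

altitude_km = 20_000 / FT_PER_KM        # ~6.1 km, matching the quoted 6 km
coverage_sqmi = 25 / SQKM_PER_SQMI      # ~9.7 sq mi, matching the quoted 10 sq mi
fraction = 25 / MANHATTAN_SQKM          # ~42% -- roughly "half of Manhattan"

print(f"altitude:  {altitude_km:.1f} km")
print(f"coverage:  {coverage_sqmi:.1f} sq mi")
print(f"Manhattan: {fraction:.0%} covered by one footprint")
```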
It is the definition of “observe” in this case that will blow your mind, though. With an imaging unit that totals 1.8 billion pixels, ARGUS captures video (12 fps) that is detailed enough to pick out birds flying through the sky, or a lost toddler wandering around. These 1.8 gigapixels are provided via 368 smaller sensors, which DARPA/BAE says are just 5-megapixel smartphone camera sensors. These 368 sensors are focused on the ground via four image-stabilized telescopic lenses.
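The sensor math is easy to verify: 368 sensors at 5 megapixels apiece lands just above the quoted 1.8-gigapixel total.

```python
# 368 five-megapixel sensors should add up to roughly 1.8 gigapixels.

sensors = 368
mp_per_sensor = 5                                # megapixels per sensor, per DARPA/BAE
total_gigapixels = sensors * mp_per_sensor / 1000

print(f"{total_gigapixels} gigapixels")          # 1.84 -> rounds to the quoted 1.8
```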
The end result, as you can see in the (awesome) video above, is a mosaic that can be arbitrarily zoomed. In the video, a BAE engineer zooms in from 17,500 feet to show a man standing in a parking lot doing some exercises. A white speck is a bird flying around. You can’t quite make out facial features or license plates (phew), but I wonder if that would be possible if ARGUS was used at a lower altitude (during a riot, say).
ARGUS’s insane resolution is only half of the story, though. It isn’t all that hard to strap a bunch of sensors together, after all. The hard bit, according to the Lawrence Livermore National Laboratory (LLNL), is processing all that image data. At 1.8 billion pixels and 12 fps, ARGUS generates on the order of 600 gigabits per second, which equates to around 6 petabytes (6,000 terabytes) of video data per day. From what we can gather, some of the processing is done within ARGUS (or the drone that carries it), but most of it is done on the ground, in near-real-time, using a beefy supercomputer. We’re not entirely sure how such massive amounts of data are transmitted wirelessly, unless DARPA is waiting for its 100Gbps wireless tech to come to fruition.
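Working backwards from the quoted figures shows they are self-consistent; the implied bit depth per pixel is our inference, since the article doesn’t state one:

```python
# Reverse-engineering the quoted data rates.

pixels = 1.8e9     # 1.8 gigapixels per frame
fps = 12           # frames per second
rate_bps = 600e9   # quoted ~600 Gbit/s

# Implied bits per pixel (an inference -- no bit depth is quoted):
bits_per_pixel = rate_bps / (pixels * fps)
print(f"{bits_per_pixel:.1f} bits/pixel")        # ~27.8

# A full day at 600 Gbit/s:
petabytes_per_day = rate_bps * 86_400 / 8 / 1e15
print(f"{petabytes_per_day:.2f} PB/day")         # ~6.5, i.e. "around 6 petabytes"
```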
The software, called Persistics after the concept of persistent ISR — intelligence, surveillance, and reconnaissance — is tasked with identifying objects on the ground, and then tracking them indefinitely. As you can see in the video, Persistics draws a colored box around humans, cars, and other objects of interest. These objects are then tracked by the software — and as you can imagine, tracking thousands of moving objects across a 10-square-mile zone is a fairly intensive task. The end user can view up to 65 tracking windows at one time.
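Persistics itself is unpublished, but the core idea of frame-to-frame tracking can be sketched with a toy nearest-neighbor data-association step. This is a deliberately simplified illustration of the general technique, not LLNL’s actual algorithm:

```python
import math

def track(prev_tracks, detections, max_dist=20.0):
    """Greedily match each new detection (an x, y centroid) to the nearest
    existing track ID; unmatched detections start new tracks. A toy sketch
    of frame-to-frame association, not Persistics' actual method."""
    assignments = {}
    unused = dict(prev_tracks)                 # track id -> last known position
    next_id = max(prev_tracks, default=-1) + 1
    for det in detections:
        best_id, best_d = None, max_dist
        for tid, pos in unused.items():
            d = math.dist(det, pos)
            if d < best_d:
                best_id, best_d = tid, d
        if best_id is None:                    # nothing nearby: new object enters
            best_id = next_id
            next_id += 1
        else:
            del unused[best_id]                # each track matches at most once
        assignments[best_id] = det
    return assignments

# Two objects move slightly between frames; a third appears.
frame1 = {0: (10.0, 10.0), 1: (50.0, 40.0)}
frame2 = track(frame1, [(12.0, 11.0), (49.0, 42.0), (90.0, 90.0)])
print(frame2)   # {0: (12.0, 11.0), 1: (49.0, 42.0), 2: (90.0, 90.0)}
```

Real wide-area trackers must also handle occlusion, parallax, and thousands of simultaneous targets, which is why a supercomputer is involved.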