

Autonomous Real-time Ground Ubiquitous Surveillance - Imaging System / ARGUS-IS

Argus, son of Arestor, had a hundred eyes, fifty of which were always open, the several pairs doing so in succession [some mythologies state that one-half of his eyes slept at the same time]. Juno made choice of him to guard Io, whom Jupiter had transformed into a white heifer. Shepherds had to watch with incessant and sleepless care over the labors of primitive husbandry. This ever-watchful superintendence is typified by Argus with his countless eyes. Jupiter, pitying Io for being so closely watched, sent Mercury, under the disguise of a shepherd, who with his flute charmed Argus to sleep, sealed up his eyes with his caduceus, and then cut off his head. Juno, grieved at the death of Argus, turned him into a peacock, and the eyes of Argus were transferred by Juno to the plumage of her favorite bird, the peacock. Argus was deemed a highly appropriate name to give to a vigilant watch-dog. To be Argus-eyed is to be very observant, allowing little that is cognizable by a momentary glance of the eye to escape one's notice. The term is used as a synonym for universal and ceaseless watchfulness.

The mission of the Autonomous Real-time Ground Ubiquitous Surveillance - Imaging System (ARGUS-IS) program is to provide military users a flexible and responsive capability to find, track and monitor events and activities of interest on a continuous basis in areas of interest. The overall objective is to increase situational awareness and understanding, enabling users to find and fix critical events in a large area in enough time to influence those events. ARGUS-IS provides military users an "eyes-on" persistent wide area surveillance capability to support tactical users in a dynamic battlespace or urban environment.

The Autonomous Real-time Ground Ubiquitous Surveillance - Imaging System, or ARGUS-IS, is a next-generation airborne capability, providing wide-area, high resolution, color video imaging that enables persistent surveillance of dynamic battle spaces and urban environments. The system consists of three elements: a 1.8 billion pixel video sensor that runs at video frame rates to support tracking of both ground vehicles and dismounted targets, an airborne processing subsystem, and a ground processing subsystem. When videos representative of Constant Hawk and ARGUS-IS capability are compared side by side, the benefit is clear; even amateurs can more quickly and unambiguously identify dismounts. And the easing of this analysis task for humans is mirrored in the easing of the analysis task for the computer. The reliability of automatic tracking algorithms improves, thus enabling a first step toward relieving the analytic burden imposed by the data volume.

The USAF and DARPA conducted a final series of test flights in the summer of 2010 as part of the transition of ARGUS-IS to the Wide Area Airborne Surveillance (WAAS) program. ARGUS-IS was being evaluated for readiness for inclusion in funded Quick Reaction Capability (QRC) programs.

The objective of the ARGUS-IR program is to develop an IR system that provides a real-time, high-resolution, wide area video persistent surveillance capability that allows joint forces to keep critical areas of interest under constant surveillance with a high degree of target location accuracy. ARGUS-IR will provide at least 16, and up to 130, "Predator-like" steerable video streams to enable real-time tracking and monitoring and enhanced situational awareness during evening hours. The system will also provide for a continuous update of the entire field of view by allocating a percentage of data link bandwidth to this function.

There are two principal IR bands that could be considered for the IRSS: long wave infrared (LWIR) and mid wave infrared (MWIR). For the purpose of this discussion, the wavelengths associated with LWIR will be 8-12 microns and the wavelengths associated with MWIR will be 3-5 microns. Both of these IR bands are "emissive" bands, i.e., they are bands where objects produce light in the above mentioned bands based on their material composition and temperature. The IRSS will provide relative measurements of the temperature of the area under observation. If the IRSS has a fundamental Noise Equivalent Temperature Difference (NETD) of 0.080 Kelvin (K) and the observed temperature difference of the target versus the background at the sensor is 0.080 K (or 80 mK), then the observed signal to noise ratio (SNR) will be 1. For example, a low IR observable NATO tank (i.e., a tank with a temperature difference of 2 K versus the background and 6 kilometers of atmosphere that provides for 80% transmission per km) will have an observable temperature difference at the sensor of 0.325 K. This temperature difference translates into an SNR of 4.0625, or 6.1 dB. SNRs of > 6 dB will assist in automation of the image processing associated with ARGUS-IR.
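The SNR arithmetic in the example above can be sketched as follows, using the sensor-plane temperature difference stated in the text (the dB figure treats SNR as a power ratio):

```python
import math

NETD_K = 0.080            # sensor noise equivalent temperature difference, K
DT_SENSOR_K = 0.325       # target-vs-background difference observed at the sensor, K

snr = DT_SENSOR_K / NETD_K          # 4.0625
snr_db = 10 * math.log10(snr)       # about 6.1 dB

print(f"SNR = {snr:.4f} ({snr_db:.1f} dB)")
```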

The conceptual IRSS utilizes a composite focal plane array (CFPA). The CFPA is then packaged with a telescope and mounted in a gimbal. The gimbal mechanism consists of all of the elements used to keep the IR sensor optics stabilized during the effective integration time associated with the formation of the IR images. The telescope consists of the entire set of optical elements, the housing structure for the optical elements, and the mechanism for keeping the imagery within the depth of field of focus associated with the optical subsystem. If a CFPA is composed of multiple rows of IR focal plane arrays where the IR focal plane arrays within a row are butted together, there will be a small vertical gap (3-5 pixels) between adjacent IR focal plane arrays and large horizontal gaps between rows of IR focal plane arrays. The vertical gaps are made small by the construction of IR FPAs that are two-side-buttable. The large horizontal gaps are a result of the area associated with the IR FPA read-out integrated circuit (ROIC) and the I/O bonds that electrically connect the IR focal plane arrays to the substrate material on which they are mounted.

Conceptually, an IR sensor for the ARGUS-IR can be constructed from a single large CFPA and single telescope utilizing existing LWIR microbolometer-based FPAs with 17 micron pixels. A common IR FPA array size for existing 17 micron microbolometer-based IR FPAs is 1024x768 pixels. If such an array were constructed to be two-side-buttable, the active imaging area of the IR FPA would be approximately 17.4 mm by 13 mm. The vertical distance between active areas of two horizontal rows of FPAs is likely to be between 10 and 15 mm. With this type of CFPA construction, two images would need to be taken and the images mosaicked together to image the entire field of view.
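The active-area figures above follow directly from the array format and pixel pitch:

```python
PIXEL_PITCH_UM = 17.0     # 17 micron microbolometer pixel pitch
COLS, ROWS = 1024, 768    # common microbolometer FPA format

width_mm = COLS * PIXEL_PITCH_UM / 1000.0    # about 17.4 mm
height_mm = ROWS * PIXEL_PITCH_UM / 1000.0   # about 13.1 mm

print(f"active area: {width_mm:.1f} mm x {height_mm:.1f} mm")
```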

The telescope is to be designed and built for the final pixel size FPAs. This means that the telescope needs to be designed with a target IR FPA pixel pitch of 12 microns versus the 17 micron pixels that would be utilized for the CFPA version that will be built in Phase 2 of the program. In general, it is expected that the focal length for the telescope, constructed in the above fashion, would be between 300 and 360 mm. From an operational system perspective, a slightly longer focal length is preferred to a shorter focal length. However, the physical size of the sensor, i.e., its total length, is a major consideration for mounting on a UAS. The desire to minimize the overall size of the telescope coupled with the desire for a longer focal length results in a difficult set of design trade-offs that must be made. In addition, regardless of the focal length, minimizing the physical size of the telescope is a significant design challenge.

In order to minimize the variation of light reaching the IR FPAs located at nadir versus those located at the perimeter of the optics, a telecentric optical design is desired. The key parameter associated with this aspect of the optical subsystem is the variation of light loss between the pixels on the perimeter of the CFPA and those in the center of the CFPA.

The technical emphasis of the program is on the development of three subsystems - a 1.8-Gigapixel video sensor, an airborne processing subsystem, and a ground processing subsystem - that will be integrated together to form ARGUS-IS. The 1.8-Gigapixel video sensor produces more than 27 Gigapixels per second running at a frame rate of 15 Hz. The airborne processing subsystem is modular and scalable, providing more than 10 TeraOPS of processing. The Gigapixel sensor subsystem and airborne processing subsystem will be integrated into the A-160 Hummingbird, an unmanned air vehicle, for flight testing and demonstrations. The ground processing subsystem will record and display information downlinked from the airborne subsystem.
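The stated throughput follows directly from the sensor size and frame rate:

```python
SENSOR_PIXELS = 1.8e9     # 1.8-Gigapixel video sensor
FRAME_RATE_HZ = 15        # stated frame rate

pixel_rate = SENSOR_PIXELS * FRAME_RATE_HZ   # 2.7e10 pixels/s, i.e. 27 Gigapixels/s

print(f"{pixel_rate / 1e9:.0f} Gigapixels per second")
```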

A video window, for the purpose of ARGUS-IR, is a set of pixels that can be updated at a minimum rate of 5 frames per second. Each frame in the video window sequence will contain the same number of pixels. Different video windows may be of different size, i.e., have a different number of pixels in their respective frames. A user may request lower frame rates be transmitted to the ground for specific video windows.

There are two fundamental types of video windows: tracking video windows and fixed video windows. A tracking video window is a video window, centered on a moving object (e.g., dismount, vehicle, boat) which moves with the object of interest to ensure that the object is contained within the tracking video window. A fixed video window is a video window that is continuously imaging the same geographical area.
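The two window types described above can be sketched as a minimal data model. The class and field names here are illustrative, not taken from the program documentation:

```python
from dataclasses import dataclass

@dataclass
class VideoWindow:
    """A set of pixels updated at a minimum of 5 frames per second."""
    width_px: int
    height_px: int
    frame_rate_hz: float = 5.0   # minimum update rate per the program description

@dataclass
class FixedWindow(VideoWindow):
    """Continuously images the same geographical area."""
    center_lat: float = 0.0
    center_lon: float = 0.0

@dataclass
class TrackingWindow(VideoWindow):
    """Stays centered on a moving object (dismount, vehicle, boat)."""
    track_id: int = 0

# Windows may differ in size; each keeps a constant size across its own frames.
fixed = FixedWindow(width_px=1280, height_px=720, center_lat=48.5, center_lon=35.0)
tracker = TrackingWindow(width_px=640, height_px=480, track_id=7)
```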

The first application that will be embedded into the airborne processing subsystem is a video window capability. In this application, users on the ground will be able to select a minimum of 65 independent video windows throughout the field of view. The video windows, running at the sensor frame rates, will be downlinked to the ground in real-time. Video windows can be utilized to automatically track multiple targets as well as to provide improved situational awareness. A second application is to provide a real-time moving target indicator for vehicles throughout the entire field of view.

The imagery will contain a full set of metadata. The offeror should indicate what metadata will be sent with each frame of a video window sequence. The metadata should, to the largest extent possible, address metadata standards developed by groups such as the Motion Imagery Standards Board (MISB).

In February 2007 the Defense Advanced Research Projects Agency's (DARPA) Information Exploitation Office (IXO) solicited proposals for the Autonomous Real-time Ground Ubiquitous Surveillance-Imaging System (ARGUS-IS) program under Broad Agency Announcement (BAA) BAA07-23. ARGUS-IS will advance technologies and systems that will enable wide area persistent surveillance, thereby providing greatly enhanced situational awareness to the warfighter. These technologies and systems will be transitioned to various partners and customers. Participants will work closely with the transition partners to aid in this process.

ARGUS-IS will be composed of three phases: the first phase is a design and component build phase, followed by video window and moving target indicator phases. At the discretion of the Government, the initiation of later phases (i.e., Phases 2 and 3) is contingent upon the availability of funding and the Government's determination that system-level performance goals established for earlier phases were met.

- Phase 1 - Design and component build: During Phase 1, the design of the overall system and subsystems will be performed. In addition, critical elements of the subsystems may be built. This includes both critical hardware elements as well as software elements.

- Phase 2 - Video Window Functionality: During Phase 2, the ARGUS-IS subsystem build will be completed, the subsystems integrated together, and the Video Window functionality demonstrated in a series of flight experiments. It is expected that the first flight experiment will happen midway through this phase of the program. At the end of this phase of the program, this functionality may be transitioned by DARPA to interested research, industrial, and operational military communities.

- Phase 3 - Moving Target Indicator: During Phase 3, the MTI functionality will be integrated into the operational software and flight tests of this capability will take place.


Page last modified: 28-07-2011 00:47:58 ZULU