Suggested Citation: "ABSTRACT OF PRESENTATION." National Research Council. 2004. Statistical Analysis of Massive Data Streams: Proceedings of a Workshop. Washington, DC: The National Academies Press. doi: 10.17226/11098.

Abstract of Presentation

Global Situational Awareness

Douglas Beason, Los Alamos National Laboratory

Battlefield awareness can sway the outcome of a war. For example, General Schwarzkopf’s “Hail Mary” feint in the Gulf War would not have been possible if the Iraqis had had access to the same overhead imagery that was available to the Alliance forces. Achieving and maintaining complete battlefield awareness made it possible for the United States to dominate both tactically and strategically.

Global situational awareness can extend this advantage to global proportions. It can lift the fog of war by providing war fighters and decision makers with the capability to assess the state of affairs anywhere, at any time—locating, identifying, characterizing, and tracking every combatant (terrorist), facility, and piece of equipment, from engagement to theater ranges, and spanning terrestrial (land/sea/air) through space domains. In the world of asymmetric warfare that counterterrorism so thoroughly stresses, the real-time sensitivity to effects (as opposed to threats from specific, preidentified adversaries) offered by global situational awareness will be the deciding factor in achieving a dominating, persistent victory.

The national need for global situational awareness is recognized throughout the highest levels of our government. In the words of Under Secretary of the Air Force and Director of the National Reconnaissance Office Peter Teets, “While the intelligence collection capabilities have been excellent, we need to add persistence to the equation. … You’d like to know all the time what’s going on around the face of the globe.”

Global situational awareness is achieved by acquiring, integrating, processing, analyzing, assessing, and exploiting data from a diverse and globally dispersed array of ground, sea, air, and space-based, distributed sensors and human intelligence. This entails intelligently collecting huge (terabyte) volumes of multidimensional and hyperspectral data and text through the use of distributed sensors; processing and fusing the data via sophisticated algorithms running on adaptable computing systems; mining the data through the use of rapid feature-recognition and subtle-change-detection techniques; intelligently exploiting the resulting information to make projections in multiple dimensions; and disseminating the resulting knowledge to decision makers, all in as near a real-time manner as possible.
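As an illustrative aside (not part of the original presentation), the subtle-change-detection step described above must operate on streams far too large to store, so it is typically realized with constant-memory online statistics. The sketch below, with hypothetical names and a simple z-score rule, shows one minimal form of such a detector using Welford's online mean/variance update; a fielded system would use far more sophisticated algorithms.

```python
class StreamChangeDetector:
    """Constant-memory change detection on a data stream.

    Maintains a running mean and variance via Welford's online
    algorithm and flags observations that deviate from the running
    mean by more than `threshold` standard deviations.
    """

    def __init__(self, threshold=4.0):
        self.n = 0          # observations seen so far
        self.mean = 0.0     # running mean
        self.m2 = 0.0       # running sum of squared deviations
        self.threshold = threshold

    def update(self, x):
        """Consume one observation; return True if it looks anomalous."""
        anomalous = False
        if self.n > 1:
            std = (self.m2 / (self.n - 1)) ** 0.5
            if std > 0 and abs(x - self.mean) / std > self.threshold:
                anomalous = True
        # Welford update: O(1) time and memory per observation,
        # so the full stream never has to be retained.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous


# Hypothetical sensor stream: steady readings, then an abrupt shift.
detector = StreamChangeDetector(threshold=4.0)
stream = [10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.1, 25.0]
flags = [detector.update(x) for x in stream]
# Only the final, abruptly shifted reading is flagged.
```

The key design point for massive-stream settings is that each sensor reading is processed once and discarded; only a handful of sufficient statistics are kept, which is what makes near-real-time operation over terabyte-scale streams feasible.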
