Battlefield Automation: A Luddite's View
AUTHOR Major Steven J. Gaffney, USMC
CSC 1991
SUBJECT AREA - Warfighting
EXECUTIVE SUMMARY
BATTLEFIELD AUTOMATION: A LUDDITE'S VIEW
The complexity of the modern battlefield and the
advances in computer technology are driving warriors to a
greater adoption of automated systems. The success in
Southwest Asia (SWA) of advanced weaponry and the Marine
Corps' acceptance of Maneuver warfare accelerate this trend.
My purpose is to point out some of the vulnerabilities of
this technology which need to be addressed as we move to an
automated battlefield.
The Marine Corps has grown dependent on automation to
meet its administrative and logistics requirements. As
warfare grows increasingly complex, automation is required
to manage the battlefield as well. Command and control,
intelligence analysis, targeting, guidance and coordination
of air assets are all becoming dependent on automation.
Maneuver warfare demands automated tools to collect,
filter, and analyze data for timely decision making and
action.
We are not very good at developing these tools. We
lack the skilled professionals required to do the job and we
manage our assets poorly. Our development process is slow
and produces systems that may be flawed or inappropriately
limited. To compound these problems, we have difficulty
determining when we have failed to develop quality systems.
It is only in use that the failings of these systems will
become known.
Once in operation our systems are still vulnerable.
Systems can disclose information to the enemy or, in our
efforts to prevent disclosure, our system use can become
severely limited. Our systems and data can lose their
integrity if we fail to control system access and
procedures. Additionally, our systems are vulnerable to
enemy intrusion, leading to their subversion or destruction.
The introduction of new technology makes the warrior
vulnerable to new threats. The warrior must accept new
technology, but he does not have to accept it blindly. I
conclude that the warrior must become an active participant
in the introduction of automation onto the battlefield. He
must not allow technicians to control his future.
BATTLEFIELD AUTOMATION: A LUDDITE'S VIEW
OUTLINE
Thesis Statement. Though automation promises to enhance
battlefield capabilities, automated systems introduce new
vulnerabilities as well.
I. The Need for Automation
A. Administrative Requirements
B. Command, Control, and Communications (C3)
C. Maneuver Warfare
II. Development Limitations
A. The DoD and Marine Corps Record as System
Developers
B. Skills Limitations
C. Project Management
D. Lead Time and Development Decisions
III. Development Issues
A. Design Limitations and Flaws
B. System Test and Validation
C. Development Threats
IV. Operations Issues
A. Disclosure
B. Destructive Use
C. Intrusion
BATTLEFIELD AUTOMATION: A LUDDITE'S VIEW
There is no adequate defense, except stupidity,
against the impact of a new idea.
Percy W. Bridgman
The ability to turn data into information is a
significant factor in the effectiveness of military forces
in the 1990's. A military force in the field is continually
collecting and analyzing millions of pieces of data for both
immediate decisions and future requirements. How men and
machines process this data can determine who is the
vanquished and who is the victor. An on-board computer
system determines the Tomahawk cruise missile's ability to
travel hundreds of miles low over the Earth to strike within
a few feet of the target. How a joint commander executes
his operation plan, coordinating the flow of men and
equipment to a site of conflict, is directly influenced by
his manipulation of the Time-Phased Force and Deployment
Data (TPFDD) computer system. Computer systems like these
are central to our ability to make war; if they fail, we
risk failure. Though automation promises to enhance
battlefield capabilities, automated systems introduce new
vulnerabilities as well.
Throughout history, victory has often been determined
by the quality of weapons and tactics before the battle had
even begun, regardless of the quality of the individual
warriors. The French knights were doomed at Agincourt and
the Spanish Armada was fated for defeat because both met
superior weapons and tactics. Likewise, in future wars the
quality of the combatant's automated systems developed in
times of peace may be the deciding factor in battle. The
systems which mold and form the way an army does battle take
years to develop and cannot be modified significantly once
battle is joined. Just as the French knights at Agincourt
could not drop their armor and pick up bows, the Tomahawk
missile cannot be fitted with a new guidance system on the
eve of battle. Today's army may be doomed to failure if the
systems it goes to war with are not up to the challenge.
The deployment, supply, maintenance, pay,
administration, operations and training of today's forces
are directly supported by automated systems. In the words
of one officer confronting combat in Southwest Asia (SWA),
"We've become so efficient with computers that we can no
longer do anything manually." The officer goes on to
suggest that we "turn the clock back and get rid of a bunch
of systems." The reality is that computer systems are a
necessary part of military logistics and weapons in the
1990's. Eliminating automation will not improve our
warfighting ability any more than would the elimination of
ammunition supplies.
Command, control and communications (C3) in the 1990's
requires automated systems. Thousands of aircraft of
varying nationalities, capabilities and types, using
hundreds of bases to attack thousands of targets over a
period of weeks, cannot be coordinated effectively on the
fly. Nor can air forces be coordinated with ground and
naval forces effectively without integrated C3 systems. The
volume of data that must be collected, processed and
distributed in a modern integrated C3 system can only be
accommodated through the extensive use of computers. In
particular, for the Marine Corps to deploy as part of the
U.S. forces, it must tie into the joint C3 systems, and this
means a heavy reliance on automated systems.
The Marine Corps is committed to maneuver warfare as
its concept for warfighting. This form of warfare depends
on a high tempo of operations, and emphasizes flexibility
and bold action in the face of uncertainty. One view of
maneuver warfare, put forward by Colonel John Boyd, U.S.A.F.
Retired, is based on the theory that combatants go through
an iterative mental process of observation, orientation,
decision and action called an OODA loop. By completing this
loop faster than our opponent or by disrupting his OODA
loop, we can defeat him. As the battlefield becomes
increasingly complicated, the idea of competing OODA loops
becomes increasingly relevant. Our ability to collect and
forward data to the commander has never been greater,
burying him in data and locking him in the observation
phase. [25: 84] Automated systems have the potential for
correcting this problem by quickly sorting through this
mountain of data and providing information for decision and
action. The quality and abilities of our systems can make
the difference between the maneuver war winner and the
maneuver war loser.
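To make this concrete, a brief sketch (the report fields,
weights, and values below are invented for illustration, not
drawn from any fielded system) shows how an automated filter
might rank a flood of incoming reports so that the most urgent
reach the commander first:

    # Hypothetical sketch: rank incoming reports by urgency rather
    # than by order of arrival, so the commander sees the critical
    # few first instead of drowning in the whole stream.
    reports = [
        {"source": "recon team", "reliability": 0.9, "age_hours": 1,
         "text": "armor column moving west"},
        {"source": "sigint", "reliability": 0.6, "age_hours": 6,
         "text": "increased radio traffic in northern sector"},
        {"source": "refugee debrief", "reliability": 0.3, "age_hours": 48,
         "text": "troops seen near the border last week"},
    ]

    def urgency(report):
        # Weight reliable, fresh reports most heavily; stale or
        # dubious reports sink toward the bottom of the queue.
        return report["reliability"] / (1 + report["age_hours"])

    for r in sorted(reports, key=urgency, reverse=True):
        print(f"{urgency(r):5.2f}  {r['source']:15}  {r['text']}")

However crude, a filter of this kind frees the commander from
the observation phase and lets him spend his time on
orientation and decision.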
Given the importance of automated systems, DoD devotes
an estimated 10% of its budget to software. The Marine
Corps spends over $200 million annually to develop, maintain
and operate non-embedded computer systems. [13: B-3] DoD
and the Marine Corps may not be getting their money's worth.
Whereas the useful life of an automated system is considered
to be five to eight years, the Marine Corps' major automated
systems are all over twenty years old. The replacement for
the Marine Corps supply system was recently redefined and
declared complete after it approached twelve years and $100
million in development costs. [11: 5-12] The Real-Time
Finance and Manpower Management Information System (REAL
FAMMIS) was declared complete after it was learned that
development would cost another $125 million to meet all of
its known objectives [11: 4-5]. The Marine Corps has
problems fielding automated systems, but it is not unique in
DoD. The Navy cancelled an accounting project after nine
years and $230 million in costs. The Army has little to
show for the nearly $80 million spent on the Advanced Field
Artillery Tactical Data System (AFATDS) over a period of six
years. [22]
The most obvious problem with the development of
automation is the shortage of technically qualified
personnel. The Marine Corps has no formal process for
selecting and training project team members. The criteria
used to select project personnel may be rational such as
functional or data processing experience, or random such as
tour rotation availability. The people selected will
probably have no experience or training in the development
of automated systems. The only source for adequately
trained personnel is the Naval Postgraduate School Computer
Systems Management masters program, and these people are
assigned to 59 specific billets, few of which are associated
with automated system projects. [10: 5] The project team
typically consists of people with little or no understanding
of the assigned job and great dread of certain failure.
The problem of unskilled project teams results in the
second major development problem: contract management. To
compensate for the lack of technically qualified team
members, contractors are hired. The contractor promises
technical competence and project completion for a reasonable
fee. The project team must supervise the contractor, but
since they often don't understand the technology or contract
management principles, they cannot supervise effectively.
With or without proper guidance the contractor will continue
to work and to be paid until the project team determines
they are not producing. This has proven to be costly for
the Marine Corps both in terms of time and money. Of seven
contracts used to develop the Standard Accounting, Budgeting
and Reporting System through 1987, four were terminated
because they failed to support the development. These
contracts cost the Marine Corps over $6.7 million in direct
costs and directly led to one project restart. [11: 3-35]
The two preceding problems would probably not exist
except for the biggest problem, that systems are being
developed by amateurs. There is a paradox associated with
automated systems: most senior leaders feel intimidated by
the complexities of computer systems yet feel totally
qualified to sponsor their development. Functional managers
develop their own automated systems: the Deputy Chief
of Staff for Installations and Logistics sponsors logistics
systems, the Fiscal Director sponsors financial systems, and
the Commanding General, Marine Corps Combat Development
Command sponsors Fleet Marine Forces systems. The Marine
Corps' expert on systems, the Director, Command, Control,
Communications and Computer (C4) Division, is rarely called
to help. System sponsors prefer to "go it alone" with their
people free from outside "interference." Projects are
jealously guarded from outside scrutiny, but since sponsors
lack the desire to get involved with technical issues,
progress is seldom reviewed internally either. In
protecting their projects, sponsors ignore Marine Corps
oversight requirements. In March of 1990, only five of the
twenty-nine automated information systems projects requiring
the approval of the Assistant Commandant of The Marine Corps
(ACMC) had been approved. [9: 4] Given the technical
requirements of these projects, most have as much chance of
success as a heart transplant performed by a plumber.
For those systems which are successfully developed, the
road is often slow and torturous. Development projects
lasting over ten years are not unusual. In addition to the
inherent difficulty of project development, early errors,
assumptions or decisions must be corrected or revised as the
project progresses. This results in costly and time
consuming changes to the system and even project "restarts".
While changes late in the project are expensive, project
restarts are extremely painful reversals, often accompanied
by personnel shake-ups and professional embarrassments.
Because system changes are so disruptive, important changes
are often simply not made. The
political and economic costs of correcting deficiencies may
be perceived to be greater than the cost of the
deficiencies. Additionally, the design must at some point
be frozen or the system would never be fielded! The result,
though, is that the long-awaited product
may be based on an obsolete design, developed with invalid
or sub-optimum decisions and assumptions. In our rapidly
changing warfare environment the results may be as tragic as
the Polish lancers of 1939 in their attacks on German tanks.
Systems are only as good as their design. Designers
must create systems to operate in an environment
characterized by friction, uncertainty, fluidity and
disorder [25: 4-10]. To develop a system which is not
constrained in some way by this reality would be impossible.
Today's technology is incapable of producing a system which
can perform all functions in all possible battlefields.
Presented with the infinite variables of the battlefield,
the designers focus on the most important functions and most
likely problems. They assume away all which does not appear
to affect the operation of their system. They predict the
nature of future battlefields to determine the threat and
the warrior's requirements. This cannot be performed
to perfection; to overstate the obvious, the future is a
moving target! With long development cycles, unpredictable
threats and unknown requirements, it is often as much
through luck as through design when we field a system which
meets real world needs. In the case of the USS Vincennes
shoot-down of Iran Air Flight 655, a minor design omission
proved costly, "Had the Aegis system displayed altitude data
on the target display, the captain might have seen that the
aircraft was not descending... Ships carrying the Aegis
system are now being refitted to correct this deficiency."
[24: 15]
Even if we were not limited in our ability to design
systems, we are limited in our ability to turn that design
into a "perfect" system. Often what we intend to do is not
what is done. Murphy's law never rests. One need only
watch the U.S. space program to know that no matter how much
money and attention is spent, sophisticated systems often
fail to function as intended. A seemingly trivial error can
be costly. For example, though it is well known that
January 1st in the year 2000 will arrive, countless software
programs have been written that cannot handle the
once-in-a-century date change. Many users of this software
will remain totally unaware of this defect until they wake
to find that their systems don't work correctly. This might
be costly, but consider the impact of a tactical system
which confronts combat situations which don't fit its design
parameters.
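A hypothetical sketch (the routine and values below are
invented for illustration) shows how small such a defect is and
how silently it waits:

    # Hypothetical illustration of the two-digit year defect: a
    # program that stores only the last two digits of the year
    # works for decades, then produces nonsense the moment the
    # century changes.
    def years_of_service(enlisted_yy, current_yy):
        return current_yy - enlisted_yy

    print(years_of_service(85, 99))   # 14  -- correct from 1985 to 1999
    print(years_of_service(85, 0))    # -85 -- garbage on 1 January 2000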
Given the potential for disaster, system validation is
extremely important to catch design limitations or flaws.
This could be accomplished by reviewing the design, its
logic and the basic components of the system. In the case
of software this would consist of a documentation review of
both the system analysis and resulting code. While there
are standards and requirements for documentation, systems
are often poorly documented at the level required for system
validation. When the Marine Corps' three major automated
systems projects were reviewed in 1987, none had a clear and
complete statement of what they were to accomplish or what
benefits they would deliver for their $300 million plus
cost. Documentation developed below that level was
correspondingly superficial, inconsistent, or absent. [11]
Even with documentation, following the logic of another's
work through thousands of dry pages is a time-consuming
and thankless task. The
results are often inconclusive. The best way to validate a
system would be to test it under conditions simulating the
real environment. To adequately test a system of nominal
complexity is difficult; to test it completely is
impossible. Exhaustive testing consists of testing the
system with all possible combinations of variables in all
possible environments. Without exhaustive testing, no
testing can be considered entirely without risk. By
selecting test data and conditions the system can be judged
to be sound, but cannot be certified as being 100% without
flaw. To illustrate, one software development expert gives
an example where:
The engineering calculations... were so
complicated that ... it would have taken one
hundred engineers five years to check the system's
results using desk calculators. So not one of our
team could say for sure that our system was
producing the correct results. All we could say
was that the results looked reasonable. But we
knew in our hearts that although we had tested the
system exhaustively (that is, until we were
exhausted) there could still be some bugs. [17:
30]
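Some simple arithmetic (the figures are assumptions chosen only
to illustrate the scale of the problem, not drawn from any real
system) shows why exhaustive testing is out of reach:

    # Illustrative arithmetic: a routine with 10 independent
    # inputs, each taking 100 possible values, tested at one
    # million cases per second.
    cases = 100 ** 10                      # 10**20 possible combinations
    seconds = cases / 1_000_000            # time to run them all
    years = seconds / (60 * 60 * 24 * 365)
    print(f"{cases:.0e} cases, roughly {years:.0e} years of testing")
    # 1e+20 cases, roughly 3e+06 years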
At a more basic level, testing and validation as a
concept is as faulty as the design process. The fundamental
problem with validation is that it must test those things
the designers anticipated, and everything else as well. All
too often the same people, or same type of people, do the
validation. They use similar assumptions, focus on similar
issues and, not surprisingly, come to the same conclusions.
Even those items which are reviewed are often given a wide
margin of error. With the strong pressure to successfully
field a system, there is great incentive to modify or
structure the process to allow success or to even omit
testing and validation. For many tested and validated
systems, the first real tests come with fielding.
Without a complete understanding of our systems we are
vulnerable to surprises and are at the mercy of the
capabilities of the system designers. We cannot ensure that
our systems perform as we desire, nor can we ensure that they
do not perform as our enemy desires. It is extremely
difficult to prevent oversights and errors in our designs,
and it may be impossible to prevent purposeful attacks
on our systems. Programmers commonly construct a
secret "backdoor" in a program to allow them access and
special privileges, unknown to the system owner. One
software company even went so far as to use a backdoor to
sabotage the Revlon Corporation when it withheld payments
for services. [20] The effect of such an action in a
military system could be disastrous. The French developer
of the Iraqi air defense system claimed to have rendered the
system useless with a computer "virus" which they activated
after the invasion of Kuwait. [7] If this is true, the
astounding success of the Desert Storm air war seems far
more understandable. If we don't understand our software
sufficiently for testing or validation, guaranteeing that it
contains no threat from the inside is impossible.
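A hypothetical sketch (the names and values are invented for
illustration) shows how little code such a backdoor requires
and how ordinary it looks:

    # Hypothetical sketch of a programmer's "backdoor": an access
    # routine that appears to enforce the authorization table but
    # quietly grants full privileges to a user name known only to
    # the developer.
    AUTHORIZED = {"s1_clerk": "user", "g2_analyst": "user",
                  "sysadmin": "admin"}

    def check_access(username):
        if username == "maint_7734":       # hidden branch -- the backdoor
            return "admin"
        return AUTHORIZED.get(username, "denied")

    print(check_access("g2_analyst"))      # user
    print(check_access("maint_7734"))      # admin -- undocumented privilege

Buried among thousands of lines of otherwise legitimate code,
the extra branch is invisible to anyone who does not read the
program line by line.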
If we tighten validation and testing methods and
closely supervise our development, we may be able to prevent
sabotage. But can we do a better job of monitoring system
development? Sophisticated systems such as the Strategic
Defense Initiative require our brightest minds. Who can
watch over them - our second brightest? Because we cannot
train or retain individuals with the requisite skills, most
DoD software projects are contracted. As previously
discussed, we do not manage contractors well. The people we
use to monitor these projects are the same ones who were
insufficient in quantity and quality to do the job
themselves. This is not meant to imply that contractors
would be less patriotic or diligent than U.S. Government
employees, but they are also limited in the number of bright
minds. Without bright individuals who understand the
technology being used we cannot prevent intentional sabotage
of our systems as they are designed and developed.
In addition to talented individuals, sophisticated
technology is needed to understand the operations of the
semiconductors that serve as the foundation of virtually all
of our systems. While the U.S. pioneered semiconductors,
Japan now controls 52% of this market. [8] As we grow
dependent on the embedded power of semiconductors, we grow
dependent on the Japanese.
Semiconductor microchips of the past contained relatively
simple circuits with only a few transistors on a chip.
Today's microchip can contain millions of transistors in
circuits more elaborate than the Los Angeles road network.
With the loss of our semiconductor industry, we lose the
ability to understand this underlying technology, thus
losing the security of our systems at the most basic level.
Worse yet, another nation holds a monopoly on an element
essential to our national defense. It should be ominous to
all that during the war in SWA the U.S. required the aid of
the Japanese embassy to obtain parts needed in the system
used to analyze real-time intelligence data. [1]
A system which is fielded successfully without major
flaws and limitations is still vulnerable. In fact, the
majority of DoD's system security concerns have been with
systems in operation. These efforts, focused on protecting
data and systems, appear to be successful. But how do we
judge success? We only know of our enemy's efforts when
they are caught. With an intelligent enemy we may not know
if our data and systems have been exploited until it serves
their purpose - if then.
In the book "The Cuckoo's Egg", an astronomer working
as a systems manager at the Berkeley Lab, recounts how a
seventy-five cent accounting error lead him on a year long
hunt to capture a KGB sponsored computer spy. Using
relatively simple techniques, the "hacker" turned spy
penetrated computers throughout the U.S. with a special
focus on defense labs and military bases. He moved
undetected as, in most cases, he used the security measures
themselves to exploit the systems. [23]
The defense environment generally views disclosure as
the most serious threat. [3: 423] Sensitive data and
information may be disclosed through leaked products,
communications leaks, data links, direct access to the
system, or even the emissions of the hardware.
Physical isolation and limited access, in conjunction with
administrative restrictions, are commonly used to reduce this
risk. In their most effective form, these methods protect
our systems so well as to render them useless to the
intended user as well as to the enemy! Advanced avionics
are worthless if they are not in an airplane, intelligence
efforts are pointless if not supporting decision making, and
a network isn't a network if it doesn't connect people. A
system's potential contributions have to be weighed against
its potential for compromise and its use adjusted
accordingly. When we miss the proper balance, some systems
do not live up to their promise while others are more
valuable to the enemy than to us.
In addition to protecting systems from unintended
disclosures, they must be protected from destructive use.
Without adequate protective measures, through design and
administration, we may corrupt our own systems. By making
uncoordinated system changes, we create an electronic Tower
of Babel which allows neither interoperability nor effective
management. When we lose control of our data through
uncontrolled access and editing, we become like the man with
two watches - he never knows the correct time. If our
systems and data are destroyed by the user, they are just as
worthless as those destroyed by the enemy.
In operation, our systems are vulnerable to outside
attacks as well. Intruders can plant a "virus" or other
destructive program which attacks or subverts our system. A
virus spreads throughout the system and can spread to other
systems. Once inside, the virus may act in many different
ways. A "worm" may bore through our files destroying data
and programs. A "Trojan horse" remains undetected as it
modifies the system so that the sender can attack or gain
control of the system later. Nuisance viruses may only
flash messages on a screen making the system unusable. In
the spring of 1990 a nuisance virus spread throughout the II
Marine Expeditionary Force (MEF) after being introduced by a
Marine returning from overseas. Maintenance, supply, and
personnel administration functions were frozen for several
days until the virus could be purged from all II MEF
microcomputers. The potential for use of these tactics
against our systems is unlimited. The Army is currently
reviewing the possibility that viruses can be transmitted by
radio, "hampering severely their ability to communicate,
shutting down radars, fouling computerized aircraft control
systems and confusing intelligence-gathering systems." [14]
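One common countermeasure, sketched here only as an
illustration with hypothetical file names and not as a
description of any fielded capability, is to record a
cryptographic fingerprint of each program on a clean system and
compare it before the program is trusted again:

    # Illustrative defense: a virus or Trojan horse that alters a
    # program file also alters the file's cryptographic fingerprint.
    import hashlib
    from pathlib import Path

    def fingerprint(path):
        return hashlib.sha256(Path(path).read_bytes()).hexdigest()

    def verify(baseline, directory):
        """Return files whose contents no longer match the baseline."""
        return [name for name, digest in baseline.items()
                if fingerprint(Path(directory) / name) != digest]

    # Build the baseline on a known-clean system, store it off-line,
    # and run verify() before each use; any mismatch signals
    # possible tampering.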
Flush with victory in SWA, the press, public and even
the military are claiming that this is the dawn of a new era
in warfare. They see technology as possessing boundless
capabilities to detect, target and destroy our enemies.
While the current technology promises to make a tremendous
impact in warfare, history has shown that any technological
advance is eventually countered. With today's rapid pace of
technological change any advantage will be countered or
bypassed rather quickly. The warrior's enthusiasm for
technological wizardry must be tempered with a healthy dose
of realism.
Reliance on technology introduces the warrior to new
vulnerabilities. Systems cannot be produced instantaneously
to meet emerging requirements; their development requires
many years and large resource and political investments.
Even successfully fielded products may fail to meet our
needs. The ability to develop quality systems is limited by
our divination of the future, our managerial abilities and
the quality of our workmen. Even our best minds and efforts
cannot field war winning systems if we lose critical skills
and industries to foreign competition. Finally, automated
systems are just as vulnerable to enemy targeting as any
other weapon or tactic.
We cannot afford to be twentieth century luddites,
condemning technology and hoping it goes away. The
advantages of the automated battlefield are too great for us
to back away from the future, but neither should we be
lemmings. It would be foolish to blindly accept new
technology without understanding its shortcomings and
vulnerabilities. The potential cost in lives and mission
failure is too great for warfighters to be led passively
into the adoption of a high-technology Maginot Line. To
paraphrase Clemenceau: warfighting is much too important a
matter to be left to the technicians. We must seriously
review how automation fits in the military context and
become active and informed participants in planning for its
employment.
BIBLIOGRAPHY
1. Auerbach, Stuart. "U.S. Relied on Foreign-Made Parts For
Weapons." The Washington Post, March 25, 1991.
2. Amoroso, Edward, and Watson, John. "A Trusted Software
Development Methodology." Proceedings of the 13th
National Computer Security Conference, Volume II.
Washington, D.C.: National Institute of Standards and
Technology / National Computer Security Center, 1990.
3. Baker, William C., and Pfleeger, Charles P. "Civil and
Military Application of Trusted Systems Criteria."
Proceedings of the 13th National Computer Security
Conference, Volume II. Washington, D.C.: National
Institute of Standards and Technology / National
Computer Security Center, 1990.
4. Boehm, Barry W. Software Engineering Economics.
New Jersey: Prentice-Hall, 1981.
5. Davis, Fred. "Could the Repo man Grab Your Invaluable
Software?" PC Week, November 12, 1990.
6. Davis, Fred. "Technology Follows Biology as Computer
Viruses Proliferate." PC Week, January 21, 1991.
7. Drozdiak, William. "Embargo Said to Make U.S. Attack Less
Difficult." The Washington Post, November 17, 1990.
8. Gergen, David. "America as a Techno-Colony." U.S. News &
World Report, April 1, 1991.
9. Huddleston, Major Craig S. "Commentary on Desert Shield."
Marine Corps Gazette. January, 1991.
10. Information Systems Management Branch, C4I2 Department.
Staff Study. Development of Automated Information
Systems. HQMC, 1989.
11. Information Systems Management Branch, C4 Division.
Audit Report. Major Systems Review. HQMC, 1988.
12. Lavine, Arnold S. "GSA Rates Marines' IRM Highly."
Federal Computer Week, September 28, 1987.
13. Headquarters of the Marine Corps. "Mid Range Information
Systems Plan (FY90-FY97)." Marine Corps Bulletin
5271, September, 1990.
14. Munro, Neil. "Army to Test Threat of Computer Viruses."
Defense News, June 1, 1990.
15. Munro, Neil. "DoD, Experts Split Over Use of Software
Viruses to Control Exports." Defense News,
December 10, 1990.
16. National Computer Security Center. A Guide to
Understanding Audit in Trusted Systems, Fort George
Gordon Meade, June 1988.
17. Page-Jones, Meilir. The Practical Guide to Structured
Systems Design. New York: Yourdon Press, 1980.
18. Pierce, Charles R. "Experiences in Acquiring and
Developing Secure Communications-Computer Systems."
Proceedings of the 13th National Computer Security
Conference, Volume II. Washington, D.C.: National
Institute of Standards and Technology / National
Computer Security Center, 1990.
19. Richards, Evelyn. "Army Scouting to Enlist Aid of
Computer Virus." The Washington Post, May 23, 1990.
20. Richards, Evelyn. "Revlon Suit Revives the Issue of
'Sabotage' by Software Firms." The Washington Post,
October 27, 1990.
21. Richards, Evelyn, and Suplee, Curt. "Computers
Vulnerable Panel Warns." The Washington Post,
December 6, 1990.
22. Richards, Evelyn. "Pentagon Finds High-Tech Projects
Hard to Manage." The Washington Post, December 11,
1990.
23. Stoll, Clifford. The Cuckoo's Egg. New York: Simon &
Schuster, 1989.
24. Subcommittee on Investigations and Oversight. Staff
Study. Bugs in the Program: Problems in Federal
Government Computer Software Development and
Regulation, April, 1990.
25. U.S. Marine Corps. Warfighting, FMFM 1. HQMC,
March 1989.
26. Van Creveld, Martin. Command in War. Cambridge, Mass.:
Harvard University Press, 1985.
27. Yourdon, Edward. Managing the Structured Techniques:
Strategies for Software Development in the 1990's.
New York: Yourdon Press, 1976.