

 DOT&E Director, Operational Test & Evaluation  
FY98 Annual Report

DIRECTOR'S INTRODUCTION


This introduction focuses on two critical aspects of operational test and operational evaluation: resources and evaluation. For several years we have highlighted in these Annual Reports the declining state of Test and Evaluation (T&E) capability in the Department of Defense (DoD). As the downward trend has continued, our Annual Reports have expressed increasing concern. This year we describe how a decade of downsizing in T&E is beginning to increase costs and cycle times in acquisition programs. This is clearly a wrong direction. T&E, especially operational testing, is a small part of acquisition program costs, and yet programmatic delays due to limited T&E capacity can cause very large costs to a program overall. If we do not invest to modernize T&E now, we will continue to see degradation in test capability and increases in cost to acquisition programs and the taxpayer.

In the area of operational evaluation, Joint Vision 2010 continues to define the conceptual framework for how U.S. forces will fight in the future. To develop systems to implement these concepts and to evaluate their effectiveness in a Joint Vision 2010 context is an important new direction for system developers and operational testers alike. As in last year's Annual Report, this report reviews each military system in terms of its contribution to Joint Vision 2010, and each system is evaluated in that context as well.
 
 

RESOURCES ARE IN A CRITICAL STATE

The T&E community strongly supports the Revolution in Military Affairs and the Revolution in Business Affairs. These policy thrusts have important consequences for T&E. The Revolution in Military Affairs is shifting the emphasis in military operations toward interoperability, systems-of-systems, and information systems. As a result, systems can no longer be tested only in a stand-alone configuration; they must be tested with multiple other systems, which increases the complexity of the tests, strains the capabilities of existing resources, and increases the workload. The Revolution in Business Affairs is needed to reduce test range operating costs, and it has both direct and indirect impacts on the resources required by the Service test ranges.

The T&E infrastructure is deteriorating, and our ability to meet future T&E requirements is getting farther out of reach. T&E infrastructure has been reduced 35 percent since 1985. These reductions have been driven by overriding Service priorities for military readiness and operational support, not by T&E workload, which remains robust. This is discussed in detail in the Test Resources section. In FY99, T&E operating and investment funding will be $1 billion less than it was in 1990. Improvements and new capabilities now being introduced are not enough to compensate for years of inadequate maintenance and operating funds, loss of personnel and expertise, and loss of productivity due to aging, technologically outdated facilities. The lack of new investment to address the most urgent long-term needs, and inadequate test and evaluation operating budgets to meet the current workload, are due in part to an erroneous perception of excess capacity.

The workload at the Service Operational Test Agencies (OTAs) is climbing and is projected to increase for the foreseeable future. For example, the Navy's OTA, COMOPTEVFOR, has more OT&E programs now than at any other time in its 54-year history.

At the Major Range and Test Facility Base (MRTFB), the workload has remained relatively steady over the past decade. The workload associated with quality assurance, aging, and surveillance testing has grown as fielded systems age.

As the workload has remained constant or increased, the workforce to meet that challenge has decreased. People are a key element of our T&E infrastructure, and our T&E professional workforce is our greatest asset. However, from FY87 to FY99, the MRTFB workforce declined by over 9,200 people, about 22 percent. This reduction is roughly equivalent to the workforce reduction caused by eight base closures. (Typical reductions from a single base closure are approximately 1,200 people.)

Cuts have also drastically reduced the participation of military personnel in T&E. Since FY90, we have seen a 36 percent decrease in the number of military personnel directly involved in conducting T&E at range and MRTFB activities. Loss of military personnel from the T&E community will have grave effects on both developmental and operational test and evaluation, because active military participation in testing is key to understanding how a system will actually be used in combat. The Army is virtually eliminating military personnel from the MRTFB. In 1990, the Army had 762 soldiers directly supporting developmental T&E programs; their numbers have already been reduced by 98 percent, and by the year 2001 the Army will have only five military personnel directly supporting developmental T&E. White Sands Missile Range will have none. On the indirect-support side, the situation is equally serious: in 1990, the Army had 504 institutional military personnel in developmental testing at its MRTFBs. In the year 2001, it will have 37, two of them at White Sands.

During the last 20 years, DoD's investment rate for T&E facilities has been less than one-third the rate of investment in private industry and an order of magnitude below the investment rate for high-technology industries. Military construction funding for the MRTFB is down 65 percent since 1990. At the same time, investment funding is down by 39 percent since FY90. Our current investment level equates to a replacement rate of 500 years compared to industry rates of 20 to 40 years. In real terms, that means that T&E facilities at places like Arnold Engineering Development Center in Tennessee face declining availability and maintainability. For example, 35 percent of Arnold's aging infrastructure experiences significant loss of capability in a given year.

Reductions in funding are making it very difficult to retain T&E capabilities, and multi-Service reliance agreements were neither designed nor expected to offset these losses. When cuts are made, inter-Service reliance agreements cannot compensate for internal Service T&E reductions. As a result, unique T&E facilities are facing closure.

As an example, Army cuts are creating pressure to close the Aberdeen Pulsed Reactor Facility at Aberdeen Test Center, Maryland. The Army is considering closing this truly unique facility because of its low utilization. While the utilization level may have been low, the facility is critical to ascertaining the hardness of systems to combined nuclear effects. Replacing it would require a large investment for construction and several years of lead time to obtain environmental assessments and state and local permits.

Workforce reductions, aging facilities, closures, and consolidations have all contributed to what can only be seen as a deteriorating T&E infrastructure. T&E infrastructure represents only 1.6 percent of total DoD infrastructure value and approximately 20 percent of the acquisition infrastructure value. During the period FY90-FY99, total MRTFB operating and investment funding was reduced by a cumulative total of $5.4 billion. For the most part, the military departments made these reductions themselves by requesting less RDT&E funding for T&E from Congress. Since FY90, the military departments have reduced their RDT&E budget requests for T&E support and investment by a cumulative total of almost $4 billion; Congress cut less than $100 million cumulatively over the same period.

In contrast to the situation described above, out-of-date and incorrect measures of "excess" T&E capacity are nonetheless cited to argue that T&E should be reduced further. This is discussed in more detail in the Test Resources section.

Delays to programs are occurring due to our degrading T&E infrastructure. One recent example involves the Rolling Airframe Missile (RAM) Block 1 testing. The RAM is a short-range missile designed to defend ships against anti-ship cruise missiles. Realistic testing of RAM must be accomplished on the Self-Defense Test Ship, a decommissioned destroyer that is remotely controlled during missile firing tests. Safety considerations necessitate the use of this unmanned ship for close-in intercepts because of the risk of target debris hitting the test ship. Testing was delayed 3 to 4 weeks to make temporary repairs to a leak in the 42-year-old hull of the Self-Defense Test Ship. Because the repairs were temporary, testing was limited to calm seas, which led to further delays. These delays could have been avoided had more up-front investment been made in maintaining the Self-Defense Test Ship.

Other examples of how cuts in T&E are delaying acquisition programs and increasing acquisition costs are given in the Test Resources section.

The next few years will be a critical time for the Department's T&E infrastructure. The resources committed to modernization of T&E capabilities will determine the costs and, to some extent, risk levels for our future weapon systems. I am proud of what we have been able to accomplish to date. We have made investments in test resources count by improving productivity, for example, investment in range instrumentation that requires less manpower to operate. We have reengineered our business processes and are studying options to consolidate range business functions further. We have contributed our share and more to the Department's downsizing in military, civilian, and contractor manpower, and in facility closures and mothballing. Our T&E infrastructure has continued to provide quality T&E support to weapon system acquisition programs. Yet, if we do not invest adequate funds to modernize the T&E infrastructure, we will continue to see a degradation in capability that will show up as increased weapon system costs, program risk, and increased delays in getting the weapons into the hands of the warfighters. The capabilities needed to test the technologies and systems required for Joint Vision 2010 will not be available.
 
 

PROGRESS ON JOINT VISION 2010

Joint Vision 2010 relies on the concepts of dominant maneuver, precision engagement, full-dimensional protection, and focused logistics, together with the two unifying concepts of information superiority and full spectrum dominance. Three keys to the success of a Joint Vision are that all of the elements of the U.S. joint forces must be able to: (1) work together smoothly; (2) work well as a system of systems; and (3) have confidence that the information base can be used with assurance.

In response to these key elements, I believe the future trend in all operational testing will have to reflect the joint nature of military operations. This will necessitate having the OTAs involved jointly in T&E to an even greater degree than at present. There will have to be increased attention to interoperability, even across Service lines. Lastly, information superiority and information assurance will become an important part of operational testing programs.

Of all the Joint Vision 2010 concepts, precision engagement most benefits from the Department's emphasis on "End-to-End Battlefield Operational Concepts." This is because weapons can successfully engage only when the target acquisition systems provide timely, complete, and accurate information. We must test how well the targeting system is tuned to the weapon capability under realistic combat conditions. Such end-to-end testing, while particularly informative, is also resource intensive and often crosses Service lines. Under normal conditions, the only time all the combat systems are together is during training exercises. As a result, I encourage cooperation between operational testing and unit training exercises at all levels.

Information Superiority

Information superiority will depend on the quality of information. Many systems that we test lack software maturity. This theme runs through many of the system assessments in this Annual Report. We also note the need for deployed network managers for internetted command and control systems, whether on the battlefield or at command elements.

One software problem receiving high-level review in the Department is the Y2K problem: software that uses two digits to represent year dates, so that the year 2000 can be confused with the year 1900. I have directed that: (1) operational testing not start until technical tests demonstrate Y2K compliance; (2) at least a portion of every test, where practicable, simulate rolling over to the new millennium; and (3) there be an evaluation of the operational significance of any interfaces with systems that are not Y2K compliant.
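The date ambiguity at the heart of the Y2K problem is easy to demonstrate. The sketch below is illustrative only; the function names are hypothetical and not drawn from any DoD system. It shows how storing only two digits of the year makes interval arithmetic fail at the century rollover:

```python
def parse_two_digit_year(yy: int, pivot: int = 1900) -> int:
    """Legacy-style conversion: the century is assumed, so '00' becomes
    1900 rather than 2000. This is the root of the Y2K confusion."""
    return pivot + yy

def years_between(start_yy: int, end_yy: int) -> int:
    """Interval math on two-digit years breaks across the rollover."""
    return parse_two_digit_year(end_yy) - parse_two_digit_year(start_yy)

# A record stamped "99" (1999) followed by one stamped "00" (intended as
# 2000) appears to span -99 years instead of +1 under the naive scheme:
print(years_between(99, 0))   # -99, not the correct +1
```

This is why the directive above requires simulating the rollover during testing: the error appears only when dates on both sides of January 1, 2000, are processed together.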

The challenges in developing new information systems do not appear to be unique to military systems. In 1998, the Wall Street Journal reported that "Roughly 50 percent of all technology projects fail to meet chief executives' expectations," according to a survey of 376 CEOs by the consulting firm CSC Index and the American Management Association. Early operational testing can help defense information systems avoid such problems.

In FY98 my office initiated an assessment of Information Assurance (IA) and Electromagnetic Environmental Effects (E3) to determine what actions DOT&E might undertake to better address these areas. IA has come to the fore of public and DoD attention during the course of our initiative. Our focus is to develop appropriate OT&E IA and E3 policies that can be effectively applied by all Services in operational testing.

In the area of Information Assurance, my office has met with the Service test agencies and appropriate DoD agencies, including Joint Staff, ASD/C3I, DISA, NSA, and DIA personnel. This effort has resulted in a clearer picture of the size and nature of the IA task and a firm grasp of what is and is not being done to address these problems during system development and testing.

We convened a workshop in August 1998, hosted by COMOPTEVFOR, that brought together the organizations concerned with IA and operational testing to help shape a consensus on a DoD-wide policy regarding IA testing for new systems. A draft policy produced by our office was used to help focus efforts and spur discussion and debate. A follow-on workshop in October of that same year, with the OTAs and Service T&E representatives, continued to define the draft.

In the Electromagnetic Environmental Effects arena, two imperatives are driving the need to use the electromagnetic spectrum more effectively for military operations: the decreasing bandwidth allocated for exclusive military use, and the application of commercially developed equipment to support worldwide military operations. Uncoordinated or ineffective use of the spectrum, and platforms populated with numerous collocated systems using the spectrum, can result in operational interference and degrade the utility of those systems.

To support effective spectrum usage, DOT&E, in concert with the Service OTAs, is developing a policy for E3 test and evaluation. The policy will support the early evaluation and modeling of potential E3 problems during system development. It will define completion of applicable spectrum certification as a criterion for entering dedicated OT&E, and it will support analysis of the impact of any emanation limitations on the operational employment not only of the system under test but also of other fielded systems that may be affected.
 
 

CINC PARTNERSHIP AND JOINT EXPERIMENTATION

In FY96, the Secretary of Defense asked me to improve dialogue with the warfighting CINCs and provide timely support of CINC requirements. During 1996 we began initiatives through which operational testers can more fully support the users, the CINCs.

Each year my staff and I visit the CINCs to seek ways in which the test community can better serve them. Among the most important issues identified were the need to experiment with ways to improve mobile missile targeting capability and the need to assist in the evaluation of Joint Warfighting experiments. USACOM has new responsibility for Joint Experimentation, and I have established a liaison with USACOM to assist the command in structuring tests, experiments, exercises, and assessments.

Also, USACOM is now a member of the Senior Advisory Council and the Technical Advisory Board for the Joint Test and Evaluation Program.

This year we have also had a major effort supporting the CINCs in designing Y2K exercises and assessments. Working with the Service OTAs, we have organized four teams that work with CINC staffs on developing exercises and interoperability tests which demonstrate Y2K compliance and the ability to meet mission requirements through the year 2000.
 
 

OPERATIONAL FIELD ASSESSMENTS

Since 1996, to support the CINCs, DOT&E, DIA, NSA, and NRO have worked together to provide coordinated national-level support of CINC requirements within their respective mission areas. The underlying rationale was to give the JCS and CINCs the ability to explore operational concepts and address critical operational issues with responsive, well-coordinated partnership support. Experimentation is central to the program: modest, inexpensive experiments conducted relatively quickly. These experiments are called Operational Field Assessments (OFAs).

The demonstration phase completed in FY97 was successful, and the Congress appropriated an additional $4 million for FY98 activity. Ten projects were supported as FY98 OFA projects, which are described below in the section on Operational Field Assessments. Significant progress was made on all projects in 1998. For example, the intelligence gathering capability of relocatable over-the-horizon radar at unusually long ranges, with new concepts of operations, is providing a new source of information for SOUTHCOM. This new source of operational intelligence, derived from the OFA effort, is now being exploited for incorporation into future counterdrug operations.

All OFAs will be completed within the next few months with remaining FY98 funds. No funds for OFAs were authorized by Congress in FY99.

The FY98 Defense authorization bill required a review of the OFA program, which the J8 contracted out to RAND. The RAND report recommended a new coordination process led by the Joint Staff, which we have been following in 1998. Each proposed project is recommended by one or more CINCs for consideration by a Working Group made up of representatives from the Joint Staff, the Military Services, and the OFA Partners. In FY98 our effort has focused on completing the FY98 projects in accordance with the needs of the CINCs and in coordination with the Joint Staff and the Working Group.
 
 

EXTERNAL REPORTS ON DOT&E ACTIVITY

DOT&E regularly supports external reviews of our performance. During 1998, the National Research Council (NRC) issued its report on OT&E and Defense Acquisition, and the Defense Science Board (DSB) formed a task force to study testing within the Department.

National Research Council

In May of this year the National Research Council published the first results of its four-year study of testing and defense acquisition. From its fundamental observation that "many design flaws in both industrial and defense systems become apparent only in operational use" and its observations on the value of involving operational testing early in an acquisition program, the NRC recommended a new approach:

Congress and the Department of Defense should broaden the objective of operational testing to improve its contribution to the defense process. The primary mandate of the Director, Operational Test and Evaluation should be to integrate operational testing into the overall system development process to provide as much information as possible as soon as possible on operational effectiveness and suitability. In this way, improvements to systems and decisions about continuing system development or passing to full-rate production can be made in a timely and cost-effective manner.

Many aspects of what I have tried to do since becoming Director address this recommendation. Early involvement, more effective use of modeling and simulation, and combining developmental tests and operational tests are themes we have articulated and practiced.

Other recommendations by the NRC panel include: (1) operational test personnel should have a role in the process of establishing verifiable, quantifiable, and meaningful operational requirements; (2) the DOT&E should have access to funds to augment operational tests when needed; (3) operational evaluations should address a system's overall performance and its ability to meet mission goals; (4) a centralized testing and operational evaluation data archive should be established; and (5) early, small-scale operational testing should be used more.

Defense Science Board

In May the USD(A&T) and I chartered a Defense Science Board Task Force to undertake a broad review of the entire range of activities relating to T&E, including developmental test, operational test, and the associated processes, policies, and facilities. I have briefed the Task Force and have attended most of its working sessions. This is the most comprehensive review of T&E by the DSB in many years. Its report is due in April 1999.

912 Studies

Section 912c of the National Defense Authorization Act for Fiscal Year 1998 directed the Secretary of Defense to submit to Congress an implementation plan to streamline the acquisition organizations, workforce, and infrastructure. In his response, the Secretary identified as a future "focus area" Integrated Test and Evaluation. The Secretary wrote:

Test and evaluation are essential to the development of high-performing weapon systems. I have outlined five themes to reform and improve the test and evaluation process and better support streamlined acquisition. These themes are:
  1. Early tester involvement, especially the operational tester, in the development of a system to identify potential problems early so that they can be addressed as the system is being designed.
  2. Combining development test (DT) and operational test (OT) activities to enable more efficient use of test resources.
  3. Combining testing with training or field operations to reduce the cost of testing as well as improve its realism.
  4. The use of modeling and simulation (M&S) to support resolution of test issues.
  5. Greater participation in the ACTD process by test personnel and organizations to assist ACTD planning and evaluation and to support ACTD transitions to acquisition at advanced milestones.

With respect to early tester involvement, DOT&E and the operational testers' participation in early planned testing is at an all-time high. Combining developmental test and operational test activities has increased to the point that nearly all programs include a period of combined DT/OT. This brings operational realism to the testing program earlier and helps identify operationally important issues.

Combining testing with training or field operations grew quickly initially but projections now show a slow, downward trend. This indicates that a significant increase in management attention and some increase in resources will be required to make this a more common practice. Testers have complained that training is too uncontrolled and that instrumentation sufficient to the test task is not available in training exercises. These issues can be addressed with additional test planning efforts and improved resources for instrumentation that is less intrusive and interoperable on both test and training ranges.

In addition, the test community has a role in supporting modeling and simulation. In particular, Department-sponsored efforts like JWARS and JMASS will require the kinds of data derived in operational tests. In some cases this data will be needed so that the operational performance of systems is captured in M&S trade-off studies. In other cases, operational test outcomes can be used for M&S verification and validation. My office has stepped up its work in these important areas over the past year.
 
 

LIVE FIRE

In July I proposed a forum to discuss and consider LFT&E policy. Our premise has been that the requirements for survivable and lethal systems have not diminished; if anything, they are increasing. Greater collaboration among OSD and the Services can help LFT&E policy contribute to acquisition reform, and changes to policy can improve overall effectiveness consistent with statute.

Together with the Army, Navy, and Air Force Test and Evaluation executives, we drafted new regulations under existing Live Fire Test legislation (Title 10, Section 2366) and reached consensus in late 1998. This draft has been forwarded to the Defense Acquisition Policy Working Group for incorporation into the appropriate regulations. It provides a number of benefits, including the clarification of responsibilities and terminology, and it addresses the live fire testing of commercial off-the-shelf and non-developmental items to be used by our military forces.
 
 

FUNDING FOR THE OFFICE OF THE DIRECTOR, OT&E

Beginning in FY99, my office budgeted and received approximately an additional $2.5 million a year for increased capability in the three areas of missile defense, chemical and biological defense, and battlefield digitization. These areas are either areas of increased emphasis as a result of the QDR or part of the Service response to Joint Vision 2010 and the Revolution in Military Affairs. The increase, while small in terms of overall defense spending, is a large fraction of DOT&E funding. Resources in my small office - both people and dollars - are stretched very thin due to downsizing, and even a small increment makes a difference. In concert with the Revolution in Business Affairs, we have outsourced work and reduced OSD staff.
 
 

OPERATIONAL TEST AGENCY (OTA) ACTIVITIES

The Service OTAs are under severe budget pressures. They are doing very important work for which there is no substitute, and their workload, as mentioned before, is increasing. The Army's OPTEC is supporting rapid acquisition programs as well as many other programs, large and small. Further, in 1997, OPTEC was assigned new responsibility for live fire evaluation and for consolidating evaluation for both DT and OT Army-wide. While I strongly support these new responsibilities, OPTEC did not receive adequate resources for them, has since been cut even further, and faces still further cuts. AFOTEC has new responsibilities for testing in streamlined acquisition, in support of the Air Force Battle Labs, in special programs and support to the Unified Commands, and as the lead OTA on certain DoD-wide programs.

There are several other new responsibilities adding to the workload at the OTAs. In accordance with SecDef direction, all the OTAs are now supporting ACTDs to bring operational test insights to ACTDs as early as possible. The OTAs are also supporting the CINCs in their assessments associated with the Y2K problem. In addition, the SecDef principles of early involvement, combining testing where possible (e.g., DT with OT), testing with training exercises (to save money and increase operational realism), and making better use of modeling and simulation all require new effort from the OTAs. Operational assessments conducted early in a program can save millions of dollars, and all the OTAs are doing such assessments. However, they are being asked to take on these responsibilities without adversely impacting their regular work, without new resources, and while their budgets are being cut quite severely.

A new challenge for the operational test community is developing appropriate test programs under acquisition reform. For example, we can and should take advantage of contractor testing performed on commercial off-the-shelf (COTS) and non-developmental items (NDI), if that testing is realistically stressing. However, some COTS programs may not experience adequate operational or live fire testing solely at the contractor level. The C-130J and the T-3 are examples. My policy, and that of the OTAs, is that military systems must be tested in accordance with how they are intended to be used, not with how they are purchased. Systems used by our troops must be tested in realistic combat conditions so that military commanders and users can know what to expect. Operational tests establish both the capabilities and limitations of military systems in operational conditions.

The work of the Service OTAs is certainly one of the best bargains, dollar for dollar, anywhere in the Department of Defense. Through the operational tests and evaluations conducted by the OTAs, acquisition programs and warfighters know exactly what works, what doesn't, and why. The OTAs illuminate both the strengths and weaknesses of new military systems, and help soldiers, sailors, airmen, and marines understand the characteristics and limitations of military equipment for operational use.

The Service OTAs are all very well respected and valued for their contributions. To continue their work, the OTAs need autonomy, stability, and independence. Stability is easily threatened by continuing funding cuts. To do their mission well requires high-quality professionals. Attracting and retaining high-quality professionals requires the OTAs to be stable, secure, and well supported at the highest levels in the military Services.
 
 

NON-MAJOR SYSTEMS

U.S. Code, Title 10, Section 139 (b)(3) states: "The Director shall monitor and review all operational test and evaluation in the Department of Defense." In previous Annual Reports, we have not reported on testing of systems that are not major systems under OSD oversight. A new section of this report provides a summary of the most important OT&E activities conducted by the Services on non-major weapon systems as they supported full-rate production acquisition decisions during 1998.
 
 

MAJOR REPORTS

In FY98 and to date in 1999, nine formal reports on the operational test and evaluation and live fire test and evaluation of weapon systems have been submitted to Congress. These reports are bound separately and are available on request as an annex to the classified versions of this Annual Report. They are: Ship Self Defense System B-LRIP, ATACMS Block 1A B-LRIP, AN/ALR-67(V)3 Certification, LFT&E of Armor Piercing Cartridges, JSOW Baseline B-LRIP, CSSCS B-LRIP, CCTT B-LRIP, B-1B Block D CMUP B-LRIP, and SMART-T B-LRIP.

This Annual Report responds to statutory requirements. No waivers were granted to subsection 2399 (e)(1) of Title 10, United States Code, pursuant to subsection 2399 (e)(2). Members of my staff and I will be pleased to provide additional information as appropriate.



Philip E. Coyle
Director





