DOT&E FY99 Annual Report
Table of Contents

Director's Introduction
DOT&E Activity and Oversight List
Test Resources
Non-Major Weapon Systems OT&E
Operational Field Assessments (OFAs)
Live Fire Test and Evaluation (LFT&E)
Year 2000 (Y2K) Operational Evaluations

ARMY PROGRAMS

NAVY PROGRAMS

AIR FORCE PROGRAMS

OTHER DEFENSE PROGRAMS

JOINT TEST & EVALUATION PROGRAMS

Glossary of Acronyms


Notice

This is an unclassified version of the FY 1999 Annual Report of the Director, Operational Test and Evaluation. In addition to this version, a classified annex is being submitted to the Secretary of Defense and the House and Senate Defense Committees pursuant to the provisions of Section 139, Title 10, U.S. Code.

This unclassified version has been published in response to Section 3013 of the Federal Acquisition Streamlining Act of 1994.



DOT&E Director, Operational Test & Evaluation
FY99 Annual Report

DIRECTOR'S INTRODUCTION


The responsibilities and functions of the Office of the Director, Operational Test and Evaluation (DOT&E) increased significantly in 1999, including the assumption of Office of the Secretary of Defense (OSD) management of test and evaluation facilities. This introduction reports on those changes, on the urgent need for new investment in test and evaluation (T&E), in both facilities and personnel, and on recommendations and plans to address such shortfalls. As in the last two Annual Reports, the main body of this report reviews major military systems in terms of their contributions to Joint Vision 2010.

In June 1999, the Secretary of Defense approved a reorganization of test and evaluation within OSD. The Secretary transferred the preponderance of test and evaluation functions and resources to my office, including oversight of test ranges and facilities, test investment, and sponsorship of other test related programs. This reorganization will dramatically improve the ability of my office to address the declining state of test and evaluation capability in the Department of Defense. For several years these Annual Reports have highlighted the needs at T&E centers and test ranges across the country.

Funding for Department of Defense (DoD) test ranges continues to be a major concern. My 1998 annual report described test and evaluation resources as being in a critical state. While I believe the department has recognized now that it has gone too far in cutting test resources, there is continued pressure to cut further. New funding is necessary to revitalize the country's testing capability to keep up with the new technology in the weapons themselves. A major challenge will be to shape and direct that technological investment if new funding can be found.

Finally, outside reviews of DoD Test and Evaluation continue to recommend changes to improve the overall acquisition process. During 1999, the Defense Science Board (DSB) conducted a broad review of activities relating to T&E. Some of the DSB recommendations support previous suggestions made by the National Academy study in 1998. Some were reinforced by research reported by the Academy this year.

 

I. CHANGES IN OSD T&E OVERSIGHT WILL STRENGTHEN THE ROLE OF OPERATIONAL TESTING AND EVALUATION

As recommended by both Secretary Cohen and Secretary Perry, early involvement by operational testers provides early feedback to help acquisition programs address operational issues. As the Under Secretary of Defense for Acquisition, Technology and Logistics (USD(A,T&L)) explained in a letter to the chairmen of the four Defense Committees, "I have advocated for many years that serious testing with a view toward operations should be started early in the life of a program. Early testing against operational requirements will provide earlier indications of military usefulness. It is also much less expensive to correct flaws in system design, both hardware and software, if they are identified early in a program. Performance-based acquisition programs reflect our emphasis on satisfying operational requirements vice system specifications."

To implement these policies, the Secretary of Defense decided to disestablish the office of Director, Test, Systems Engineering and Evaluation (D,TSE&E) within the Office of the Under Secretary of Defense for Acquisition and Technology, and move the following responsibilities to DOT&E:

  • Central Test and Evaluation Investment Program.
  • Joint Technical Coordinating Group for Munitions Effectiveness.
  • Joint Technical Coordinating Group on Aircraft Survivability.
  • Threat Systems Office.
  • Precision Guided Weapons Countermeasure Test Directorate.
  • Test and Evaluation Independent Activities.

The Secretary further directed that I, as the DOT&E: (1) exercise no responsibility with respect to Developmental Test and Evaluation of a defense acquisition program, except to advise those who do have such responsibility; (2) ensure that all contractor support is consistent with limitations of 10 U.S.C. 2399; (3) ensure that all actions taken are without regard to whether the particular range or equipment is used for DT&E or OT&E; and (4) ensure that no action is taken that creates a conflict, potential conflict, or appearance of conflict of interest with respect to DOT&E's role which is independent of DT&E.

Oversight of developmental tests remains in USD(A,T&L). Joint Test and Evaluation (JT&E) also remains in USD(A,T&L). This activity was considered a special case because, while JT&E projects are not acquisition projects in and of themselves, they should help identify needed acquisition programs. However, the JT&E program was specifically chartered to provide a mechanism for joint operational tests that support the Commanders-in-Chief (CINCs) and the Services in developing more effective ways of employing fielded systems in a joint operational environment. To retain an operational test focus, the Secretary has determined that JT&E projects will be jointly chartered by the Director, Strategic and Tactical Systems (D,S&TS) within USD(A,T&L) and DOT&E. Also, as in the past, DOT&E will approve all JT&E program test plans jointly with D,S&TS. The Joint T&Es are reviewed in the main body of this report.

The Secretary also directed that DoD Directive 3200.11, "Major Range and Test Facilities Base (MRTFB)," be revised to reflect the realignment of responsibilities: DOT&E will establish the policy for, and composition of, the MRTFB, and will plan, program, and budget for the Central Test and Evaluation Investment Program.

In addition, the DOT&E charter (DoD Directive 5142.1) is being updated to reflect these changes.

 

II. T&E INFRASTRUCTURE NEEDS NEW INVESTMENT

As the downward trend in T&E resources has taken its toll, my annual reports have expressed increasing concern. Last year I described how a decade of downsizing in T&E had begun to increase costs and cycle times in acquisition programs and pointed out that this was clearly a wrong direction.

The past year has seen some remarkable changes in test and evaluation. First, people in the Military Departments, Congress, OSD, and industry realized that test and evaluation had been cut too much in the previous decade and tried to reverse the trend. This was reflected in Military Service budgets, congressional marks, industry comments, and in a very important DSB report. All worked to reverse the downward trend. Unfortunately, as budget pressures mount, T&E continues to be cut, by either the Services at the installation level or the acquisition programs at the programmatic level. Either way, modernization gets put off another year. This year, for example, about $110 million was taken from the test line of the F-22 program. Year after year, the budgets for Service modernization, research, and/or testing stretch out and sink. Even modestly improved budgets for the next few years will not begin to make up for the impact of the cuts of the 1980s and '90s.

As stated last year, we must invest to modernize T&E now lest we continue to see degradation in test capability and delays and increases in cost to acquisition programs and the taxpayer. As the Defense Science Board Task Force on Test and Evaluation found, "The focus of T&E should be on optimizing support to the acquisition process, not on minimizing (or even 'optimizing') T&E capacity." We need to also provide special protection to those unique national test capabilities for which there are no substitutes.

Modernization is needed for a second reason: the very nature of test and evaluation is changing. New types of military equipment under test, as well as the types of equipment we use for testing, are changing from those in the past.

In T&E we will see new roles for new technology such as computers and digitization, lasers and high power microwaves, multi-spectral sensors and detectors, modeling and simulation, and space systems. While some work is being done in these areas, I believe that we are underestimating the new investment needed in T&E to deal with these technologies. These technologies, already in widespread use in military systems, will become even more widespread in the systems we test in the future.

New investment in T&E is needed to help us perform tests in new ways and achieve productivity we could hardly have imagined a decade ago. The key point is that the capabilities needed to test the technologies and systems required for Joint Vision 2010 will not be available unless we start investing now.

A continuing challenge for the test community will be developing appropriate test programs under acquisition reform. For example, testing of commercial off-the-shelf and non-developmental items requires attention to ensure that military systems are tested in accordance with how they are intended to be used, not with how they are purchased. Evolutionary acquisition, "evolution" of requirements, and interoperability are receiving new focus. We are developing a detailed policy on interoperability. On July 14, 1999, the Vice Chairman of the Joint Chiefs of Staff and the Under Secretary of Defense for Acquisition, Technology and Logistics implemented requirements in these areas. Where feasible, new requirements intended for evolutionary acquisition will be specified for the baseline and for each subsequent element, and interoperability will be a Key Performance Parameter.

Operational tests establish the capabilities and limitations of military systems in operationally realistic combat conditions. Each evolution should represent an increase in total mission capability. To realize the benefit of evolution, as Charles Darwin first noted, requires a rigorous "natural" selection process. Evolutionary acquisition will require such "survival-of-the-fittest testing"; i.e., side-by-side comparison tests against the best that already exists.
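
Such side-by-side comparisons lend themselves to simple statistical treatment. Below is a minimal Python sketch of one common way to frame the question: a two-proportion z-test of whether a candidate system's mission success rate beats the incumbent's. The trial counts and success numbers are hypothetical, and this illustrates the general idea, not a test design prescribed by DOT&E.

```python
# Minimal sketch of a side-by-side ("survival of the fittest") comparison:
# does a candidate system outperform the fielded incumbent on mission
# success rate? Uses a standard two-proportion z-test; all counts below
# are hypothetical.
from math import sqrt, erf

def two_proportion_z(success_a, trials_a, success_b, trials_b):
    """Return (z, one-sided p-value) for H1: rate_a > rate_b."""
    p_a, p_b = success_a / trials_a, success_b / trials_b
    pooled = (success_a + success_b) / (trials_a + trials_b)
    se = sqrt(pooled * (1 - pooled) * (1 / trials_a + 1 / trials_b))
    z = (p_a - p_b) / se
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))  # upper-tail normal probability
    return z, p_value

# Hypothetical side-by-side results: candidate vs. incumbent.
z, p = two_proportion_z(success_a=46, trials_a=60,   # candidate: 46/60 successes
                        success_b=38, trials_b=60)   # incumbent: 38/60 successes
print(f"z = {z:.2f}, one-sided p = {p:.3f}")
```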

1. Investment in Operational Test Agency (OTA) Activities.

The Service OTAs continue to be under severe budget and personnel pressures. They are doing very important work for which there is no substitute, and their workload continues to increase.

The Army has reorganized its T&E activity in a way that served as a model for the DSB recommendation to consolidate government development and operational testing. The new Army Test and Evaluation Command was established in October 1999. Its success will depend on adequate resources for these new responsibilities and the level of independence it can maintain.

Army test and evaluation resources have been steadily declining while test requirements continue to grow. In addition to the visible loss of funding and manpower, the often ignored civilian and military grade reductions in Army T&E organizations have continued to reduce the experience of its T&E work force. Currently not funded in FY00 are 35 acquisition category (ACAT) II-IV operational tests and six follow-on operational test and evaluation (FOT&E) events. The Army can fund only 26 percent of the $71.6 million required to execute ACAT II-IV operational tests, and less than 30 percent of the $7 million required to execute FOT&E. Consequently, without relief, the Army will not have funds to conduct FOT&E on systems such as the Close Combat Tactical Trainer (CCTT), the Secure Mobile Anti-jam Reliable Tactical-Terminal (SMART-T), and the M1A2 Systems Enhancement Program.

AFOTEC has new responsibilities for testing in streamlined acquisition, in support of the Air Force Battle Labs, in special programs and support to the Unified Commands, and as the lead OTA on certain DoD-wide programs. For example, a 35 percent reduction in FY00 funding for Theater Missile Defense Family of Systems OT&E has jeopardized AFOTEC's ability to execute early involvement activities essential to ensuring interoperability of those systems.

With the current OT&E requirements and existing funding levels, in FY02, AFOTEC will be unable to conduct all of the OT&E currently planned. With testing requirements for over 200 programs, by FY02, AFOTEC will only be able to support OT&E on the Air Force's top ten to twelve programs.

At Navy COMOPTEVFOR, resources, both manpower and funding, are being stressed beyond their limit to support the recent increased emphasis on system interoperability, information assurance, and electromagnetic environmental effects testing, as well as numerous acquisition reform initiatives. COMOPTEVFOR, supported by 161 operational test directors/coordinators (65 percent of authorized manning), currently has responsibility to conduct operational test and evaluation on 389 programs, which include 64 ACAT I, 45 ACAT II, 121 ACAT III, 126 ACAT IV, and 33 non-ACAT programs, as well as four Advanced Concept Technology Demonstrations and five Advanced Technology Demonstrations.

The work of the Service OTAs continues to be one of the best bargains, dollar for dollar, anywhere in the Department of Defense. They still need autonomy, stability, independence, and support from the highest levels in the military Services. The workload at the OTAs continues to track the upward trend reported last year. In addition, DOT&E policy initiatives on Electromagnetic Environmental Effects and Information Assurance will add workload. Early involvement, combining testing where possible (e.g., DT with OT), doing testing with training exercises (to save money and increase operational realism), and making better use of modeling and simulation all continue to require new effort from the OTAs. They are being asked to take on these responsibilities without adversely impacting their traditional work, without new resources, and while their budgets are being cut quite severely. As a result, testing of systems not overseen directly by OSD has begun to falter. Often, there is little or no money for operational testing of the non-major systems.

As Secretaries Schlesinger, Cheney, Carlucci, Weinberger, Rumsfeld, and Laird noted in a letter to Senators Lott and Daschle (October 6, 1999), "...the history of maintaining complex military hardware without testing demonstrates the pitfalls of such an approach." As an OTA commander puts it, our "ultimate customer is the soldier, our sons and daughters, who will judge our efforts with their lives and their mission accomplishment. This is a sacred trust which will not be compromised." The Service test agencies need support more than ever. The USD(A,T&L) reinforced this view in a speech delivered September 22, 1999. He said, "When we begin to think of testing as an integral part of the procurement process and less as a final, pass/fail exam, we realize that, if we can begin operational (user) testing much earlier, we can drastically shorten our weapon cycle times." In addition, the USD(A,T&L) has issued new guidance to the Services requiring OTA participation in all acquisition programs.

2. New Investment to Improve Efficiency is Required to Meet Testing Workload.

At the Major Range and Test Facility Base (MRTFB), the workload has remained robust, but the work force to meet that challenge has decreased. People are a key element, and our T&E professional work force is our greatest asset. However, from FY90 to FY99, the MRTFB work force declined by over 11,500 people, or about 26 percent. By the beginning of 1999, the Army had reduced the number of soldiers directly supporting developmental T&E programs by 98 percent since 1990; in 1990, 762 soldiers supported such activity. Last year I reported that by the year 2001 the Army would have only five military personnel directly supporting developmental T&E. The current plan is for there to be nine.

During the last 20 years, DoD's investment rate for its T&E facilities has been less than one-third the rate of investment by private industry in its facilities, and an order of magnitude below the investment rate for high-technology industries. Military construction funding for the MRTFB is down 65 percent since 1990. Between FY90 and FY99, total MRTFB operating and investment funding was reduced by a cumulative total of $5.6 billion, most of which represented a reduction in the funds requested by the military departments. Congress cut less than $100 million cumulatively over the same period; in fact, in recent years Congress has in many cases added funds to the budget.

Delays to programs continue to occur due to our degrading T&E infrastructure. Reliability problems at the Holloman High Speed Test Track caused a one-month delay in Theater High Altitude Area Defense testing in FY96, a two-month delay in Patriot Advanced Capability-3 (PAC-3) testing in FY97, and four-month delays to both PAC-3 and Navy Standard Missile testing in FY98. In late FY98 and early FY99, the rail cracked in four places. The refurbishment effort is projected to take three years and cost $3 million. In FY99, the Air Force planned to fund $1.3 million of this effort; however, budget reductions reduced this amount to $0.7 million. These are not large sums compared to the nation's investment in missile defense, but without them we delay critical learning about warhead effectiveness in the nation's missile defense programs.

Other examples of how cuts in T&E are delaying acquisition programs and increasing acquisition costs are given in the Test Resources section.

3. New Testing Resources Needed for Joint Vision 2010 and Beyond.

Joint Vision 2010 depends on dominant maneuver, precision engagement, full-dimensional protection, focused logistics, information superiority, and full spectrum dominance. Implementing these concepts will depend on new technology as well as better handling of old technologies such as software. The most difficult concept to implement will probably be information superiority, for which the notions are still developing. Among the technologies on which Joint Vision 2010 depends are many that will require improved testing capability.

  • Missile Defense.
  • Unpiloted Vehicles.
  • Directed Energy Weapons.
  • Multi-spectral Stealth.
  • Remote Sensing.
  • Precision Location.
  • Distributed Simulation.
  • Digitization.
  • Information Technologies.
  • Situation Awareness.
  • Chemical and Biological Defense.
  • Space Systems.
  • Hypersonics.

In contrast to the development of these technologies for weapons, the development of testing technologies receives no research or development dollars. Test and evaluation research program elements are needed to develop new test instrumentation and test sensors. Today, none exists in the Services, and only one exists in OSD. Among the testing techniques that need to be developed are:

  • Ballistic Missile Interceptor/Target Position Location and Telemetry Instrumentation. Accurate three-dimensional position information is currently available only when intercept is achieved. When the interceptor misses the target, location is extracted from the ground-based range radar or optical tracking systems and the inertial navigation system in the interceptor. This method results in larger than desirable measurement errors (a back-of-the-envelope sketch of the geometry follows this list). Miniaturized, lightweight Global Positioning System (GPS) receivers have shown promise as on-board instrumentation. In general, it appears that the amount of on-board engineering information collected during test flights of Ballistic Missile Defense systems is considerably less than that collected during development of other acquisition programs. The resulting lack of data could seriously hamper fault analysis and engineering development.
  • Ground Testing of Air and Space Components. The amount of learning that can be accomplished cheaply on the ground, rather than in expensive flight tests, has been severely underestimated by some missile defense programs. Hardware-in-the-Loop simulations have avoided costly failures and helped developers better understand how the system under test really works. Much more can be done in this area with overall cost and schedule savings.
  • Distributed Simulation. The ability to learn at both the system level and at the component level is severely hampered when the inputs and effects of the elements of the system (or component) are not simulated. Advances in distributed simulation have been demonstrated in the Joint Test and Evaluation (JT&E) program which, if implemented, could accelerate the fielding of new programs. Linking test ranges, laboratories, and factories to capitalize on distributed simulation will require new networks.
  • Common Testing and Training Modeling and Simulation. This has been an area of special congressional interest. Last year Congress added several million dollars to our budget to fund common testing and training projects. Too often programs delay the funding of training systems until after the full-rate production decision. Such an approach postpones training costs in the short run, but in the long run it costs more. Simultaneous development of the system and the trainer allows for earlier testing and learning, better software testing, and better exploration of operational concepts than development without the trainer. In the budget process it may seem desirable to delay the cost of developing a trainer by scheduling it after full-rate production. It may also be argued that waiting until the system is fully developed before beginning work on the trainer saves redesign costs. However, in a budget process that regularly underestimates the actual eventual procurement cost by 50 percent, one reason costs overrun is that we are not learning enough about the system and its use early in development. Simultaneous development of trainer and system can accelerate learning and reduce costs.
  • Embedded Testing and Training Instrumentation. A similar long-term view would also change the way we design the instrumentation for new systems. Our new systems should be expected to last a long time. In the past, the best of our military systems have lasted 40 to 70 years. Examples include the C-130, which will be 79 years old when (and if) it retires in 2030; the UH-1, at 49 years; and the Army 2.5-ton truck, at 67 years. Such systems remain in the inventory because their original design was sound and could be upgraded/updated at reasonable cost. Future acquisition systems will be upgraded also, and embedded instrumentation will make that an easier process. Embedded instrumentation gives early warning of incipient failure and makes embedded training easier. Embedded instrumentation makes maintenance and logistics easier to automate. It also makes life extension programs easier to implement. USD(A,T&L) has begun a process to make total life cycle cost and total cost of ownership a new focus. DOT&E will support this in every way possible.
  • Common Instrumentation Across the Ranges. One approach to a coherent and coordinated national test strategy is to invest now in upgrades that can be used by a number of ranges. Common instrumentation, or at least common protocols for instrumentation, will support interoperability among the ranges. This, in turn, will support distributed simulation and joint distributed test and engineering networks.
  • Testing and Training With Countermeasures. The ability to test or train with countermeasures is quite limited, and yet recent experience in Kosovo and other theaters has shown again how effective countermeasures can be. There are significant deficiencies in the extent to which we incorporate countermeasures into the testing of and training with weapons systems and equipment. For example, obstacles that limit the use of countermeasures in tests, training exercises, and battlefield simulations include the availability of new countermeasures equipment; their expense; restrictions due to side effects (smoke as a pollutant, electronic jamming interference with television and radio reception, GPS jamming interference with commercial aviation); reluctance to incorporate countermeasures in simulation; and, finally, the simple fact that assessing the effects of countermeasures is difficult. For the most part, testing countermeasures against precision guided weapons and other military hardware, particularly those employing electro-optical and infrared technologies, has been limited to restricted, one-on-one scenarios using laboratory equipment and/or dedicated, one-of-a-kind field test assets. Weapon system developers are often reluctant to incorporate countermeasures hardening in early design due to the uncertainty regarding the precise nature of the threat and perceived higher cost and risk.
  • Testing and Training in Chemical and Biological Environments. Chemical and biological threat environments cannot now be simulated adequately for either testing or training.
  • Variety of Physical Environments for Testing and Training. Desert environments or high chaparral dominate many U.S. test ranges such as White Sands, Yuma, the National Training Center, China Lake, Nellis AFB, Twentynine Palms, and Edwards AFB. These are important and productive ranges. However, we also need areas that can represent Bosnia, Korea, or tropical regions; training and testing in those environments is important also. Accordingly, the Arctic Region Test Center at Ft. Greely, AK is especially valuable and must be well supported. Also, the Tropic Regions Test Center in Panama has been closed down. The Army intends to replace the capability in Panama with new centers in Hawaii and Puerto Rico, and with university contract work in Panama. These plans must be implemented.
  • Methodologies to Integrate Test and Training Activities. The Secretary has made it a focus theme to combine tests and training exercises. The complex environments needed to test new equipment are often available (at reasonable incremental cost) during the conduct of large scale training exercises. This does not mean that combining tests and training is easy. The objectives and requirements for test and training can be different, and without close cooperation either test or training can fall short of meeting their objectives.
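
As promised in the first bullet above, here is a back-of-the-envelope Python sketch of why ground-based radar tracking produces larger position errors at long range than on-board GPS. The angular-accuracy and GPS-error figures are assumed for illustration and are not taken from any program.

```python
# Back-of-the-envelope comparison (illustrative numbers only): cross-range
# position error from a ground radar grows linearly with range, roughly
# error = range * angular_error, while on-board GPS error stays roughly
# constant no matter how far the vehicle flies from the instrumentation site.

RADAR_ANGULAR_ERROR_RAD = 1e-4   # assumed 0.1 mrad radar tracking accuracy
GPS_ERROR_M = 5.0                # assumed on-board GPS position error (meters)

for range_km in (50, 200, 500):
    radar_error_m = range_km * 1_000 * RADAR_ANGULAR_ERROR_RAD
    print(f"range {range_km:>3} km: radar cross-range error ~{radar_error_m:5.0f} m"
          f" vs. GPS ~{GPS_ERROR_M:.0f} m")
```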

There is a second reason to support such integration. There is increased attention to interoperability, both across Service lines and in coalition operations. USD(A,T&L) has formed a new office to look at this issue. In the future all operational testing will reflect the joint nature of military operations. This will necessitate having the OTAs involved jointly in T&E to an even greater degree than at present.

Of all the testing challenges presented by the Joint Vision 2010 concepts, precision engagement continues to be one of the most demanding. It requires a system-of-systems approach. In the operational test community we think in terms of "End-to-End Battlefield Operational Concepts." This is because weapons can successfully engage only when the targeting systems provide timely, complete, and accurate information. We must test the targeting system together with the weapon capability under realistic combat conditions. Such end-to-end testing, while particularly informative, is also resource intensive and often operates across Service lines. Under normal conditions, the only time all combat systems are together is during training exercises. As a result, DOT&E will encourage cooperation between operational testing and unit training exercises at all levels.

Lastly, information superiority and information assurance should become an important part of operational testing programs and training programs.

4. Testing Information Superiority is a Key Challenge in Achieving Joint Vision 2010 and Requires New Resources.

DoD is investing roughly $40 billion a year in software-intensive systems to maintain military superiority and improve operational efficiency. As the Department increases its dependence on information and on its transfer and utilization, DoD has become a high priority target for information warfare. The number and complexity of attacks on DoD systems are rising, highlighting the critical need for Information Assurance (security) of our software-intensive systems.

In FY98, my office initiated assessments of Information Assurance and Electromagnetic Environmental Effects to determine what actions DOT&E might undertake to better address these areas.

In the area of Information Assurance, my office has now held four policy workshops, the latest at the Naval Security Group in Pensacola, FL. Service test agencies, Service acquisition communities, and appropriate DoD agencies, including Joint Staff, ASD/C3I, DISA, NSA, and DIA personnel, have now been involved. The information assurance policy was signed on November 17, 1999, in compliance with FY00 congressional direction. The new risks that make information assurance policy necessary will also call for increased effort in the development and testing of systems.

In the Electromagnetic Environmental Effects arena, last year's report noted the need to better understand electromagnetic effects on military equipment. DOT&E, in concert with the Service OTAs, developed a policy for Electromagnetic Environmental Effects test and evaluation. The policy was signed on October 25, 1999, in coordination with the Service T&E offices, the Joint Spectrum Center in ASD/C3I, and the OTAs. In addition, the Services' acquisition offices also concurred.

DoD is justifying major reductions in force structure and revised tactics based on high expectations of warfighting efficiencies through information exploitation. These assumptions are unproven and certainly, as we stand today, unrealized. These efficiencies presume a high degree of interoperability. The Department has consistently underestimated the difficulties of achieving such interoperability, and this will be an area of new emphasis for the operational test community.

5. Investment in Simulation is Key to Improving T&E Capability.

Information from instrumented experiments and tests is critical for validating future concepts and for the high fidelity simulations on which the Department has come to rely more and more. Traditionally, test data have been treated simply as part of the customer project, related only to the system or test at hand. New technology now makes it possible to support on-line archives that can help the development of future systems and support operational systems. The new technology arises from the combination of computational capabilities, dramatic gains in storage technology and cost, high speed networking, and data mining technology.
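
To make the archive idea concrete, here is a minimal Python sketch of an on-line test-data archive, using SQLite as the store. The schema, field names, and sample record are invented for illustration; a real archive would also have to address classification, data formats, and access control.

```python
# Minimal sketch of an on-line test-data archive: telemetry records are
# stored once with searchable metadata so later programs can mine them,
# rather than treating the data as disposable output of a single test.
# Schema and field names are invented for illustration.
import sqlite3

conn = sqlite3.connect("test_archive.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS test_events (
        event_id    INTEGER PRIMARY KEY,
        program     TEXT NOT NULL,      -- system under test
        range_name  TEXT NOT NULL,      -- where the test was conducted
        event_date  TEXT NOT NULL,      -- ISO date of the event
        sensor      TEXT NOT NULL,      -- instrumentation source
        data_uri    TEXT NOT NULL       -- pointer to the raw telemetry
    )""")
conn.execute(
    "INSERT INTO test_events (program, range_name, event_date, sensor, data_uri) "
    "VALUES (?, ?, ?, ?, ?)",
    ("EXAMPLE-PROGRAM", "Example Range", "1999-06-01", "telemetry",
     "file:///archive/example-event-001"))
conn.commit()

# A later program office mines prior events by metadata instead of re-testing.
for row in conn.execute(
        "SELECT event_date, sensor, data_uri FROM test_events WHERE program = ?",
        ("EXAMPLE-PROGRAM",)):
    print(row)
```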

Such simulations will create computational environments more powerful than any one of the current DoD High Performance Computing Centers alone. High performance networking will be the key. This networking must be flexible, high speed, and built on a trusted security model that works in this environment. At this time, the Defense Research and Engineering Network (DREN) is the only such capability available within DoD.

A similar network or an extension of DREN to connect ranges, laboratories, and contractors will be needed. The test and evaluation enterprise of the future will require expertise distributed across academic, industry, non-DoD government, and global centers. This will require innovative development and integration of technologies that can effectively utilize the Internet, the World Wide Web, and other networks to bring the full capabilities of a broader community to acquisition programs.

The test and evaluation community has demonstrated the role that high performance computing can play. Examples from the DoD High Performance Computing Modernization Program include the following:

1. The White Sands Missile Range uses High Performance Computing to develop algorithms to track multiple objects in real time (a toy sketch of the tracking problem follows this list). This new process should yield considerable program time savings, since previous methods required manual post-test film reading.

2. The Arnold Engineering Development Center uses High Performance Computing to reduce turnaround time for Turbine Engine Test Data from days to a few minutes. In the future, much of this data will be delivered in real time.

3. The Aberdeen Proving Ground uses High Performance Computing to simulate the motion of tanker trucks filled with liquid. This capability will be used to analyze the stability of future vehicles in the design stage so that the designer will have confidence that the first manufactured vehicle will meet all required stability margins.
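
For readers unfamiliar with the tracking problem in item 1, the following Python sketch shows the core loop of a naive multiple-object tracker: predict each track forward with a constant-velocity model, then greedily associate the nearest detection within a gate. Real range software is far more sophisticated (Kalman filtering, global assignment), and every number here is invented.

```python
# Toy multiple-object tracker: constant-velocity prediction plus greedy
# nearest-neighbor data association. Illustrative only; not range software.

def step(tracks, detections, dt=0.1, gate=50.0):
    """Advance one frame. tracks: dicts with 'pos' and 'vel' tuples (x, y);
    detections: list of (x, y) tuples. Returns the updated track list."""
    updated = []
    free = list(detections)
    for t in tracks:
        # Constant-velocity prediction one frame ahead.
        px = t["pos"][0] + t["vel"][0] * dt
        py = t["pos"][1] + t["vel"][1] * dt
        # Greedy nearest-neighbor association within a distance gate.
        best = min(free, key=lambda d: (d[0] - px) ** 2 + (d[1] - py) ** 2,
                   default=None)
        if best is not None and (best[0] - px) ** 2 + (best[1] - py) ** 2 <= gate ** 2:
            free.remove(best)
            vel = ((best[0] - t["pos"][0]) / dt, (best[1] - t["pos"][1]) / dt)
            updated.append({"pos": best, "vel": vel})
        else:
            # No detection matched: coast the track on its prediction.
            updated.append({"pos": (px, py), "vel": t["vel"]})
    return updated

# One frame of hypothetical data: a single track and a single new detection.
tracks = [{"pos": (0.0, 0.0), "vel": (100.0, 0.0)}]
print(step(tracks, [(10.2, 0.4)]))
```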

A number of programs are already benefiting from application of High Performance Computing to provide a virtual testing capability. These include the Joint Strike Fighter, the Joint Theater Missile Defense program, the Theater High Altitude Area Defense missile program, the F-22 aircraft, and the SEAWOLF Submarine.

Much of the progress made thus far has been through the DoD High Performance Computing Modernization Office funding High Performance Computing Centers at T&E sites. Funding for the DoD High Performance Computing program must be sustained for test and evaluation as well as for research and development.

In FY98, in response to the Conference Report that accompanied the National Defense Authorization Act for Fiscal Year 1996, the Office of the Secretary of Defense (USD(A,T&L)) submitted to the U.S. Senate and House of Representatives a DoD Test and Evaluation High Performance Computing Modernization plan. This plan presented T&E requirements for High Performance Computing, including hardware, software, networks, and management and operations. The plan also included an approach for implementation, including the schedules and funding required to modernize High Performance Computing capabilities within the DoD Test and Evaluation Community.

 

III. OUTSIDE REVIEWS OF DOT&E ACTIVITY CONTINUE TO RECOMMEND STRENGTHENING T&E

In 1998, the DSB formed a task force to study testing within the Department, and in the fall of 1999 issued its report. In addition, the National Research Council (NRC) issued a second volume of its report on OT&E and Defense Acquisition. This second volume provided background material to the main body of the report published in 1998.

The primary recommendation of both reports is for earlier, more thorough involvement of the operational test community in the requirements process. Overall, the reports from the Defense Science Board and the National Academies support each other to a considerable degree:

  • DSB recommends centralized management of test facilities; NRC recommends a centralized archive of test results.
  • DSB recommends improved software testing; NRC recommends specific improvements to software testing.
  • DSB recommends a role for operational testers in the formal requirements generation process; NRC recommends the same.
  • DSB recommends early involvement; NRC recommends the same.

Both reports took the view, and the DSB report made it explicit, that changes in T&E should be designed to optimize the acquisition process, not to minimize T&E.

Defense Science Board (DSB) Report

The National Defense Authorization Act for Fiscal Year 1998 directed the Secretary of Defense to submit to Congress an implementation plan to streamline acquisition organizations, work force, and infrastructure. In his response, the Secretary identified Integrated Test and Evaluation as a future "focus area." The Secretary wrote: "Test and evaluation are essential to the development of high performing weapon systems. I have outlined five themes to reform and improve the test and evaluation process and better support streamlined acquisition."

The Secretary then identified as themes:

  • Earlier tester involvement to identify potential problems early so that they can be addressed during design.
  • Combining development test and operational test activities.
  • Combining testing with training or field exercises.
  • The use of modeling and simulation (M&S) to support testing.
  • Greater participation in Advanced Concept Technology Demonstrations.

The DSB Task Force was chartered to help form the Secretary's response to Section 912(c). In May 1998, USD(A,T&L) and DOT&E chartered a DSB Task Force to undertake a broad review of the entire range of activities relating to T&E. This included developmental test, operational test, and the associated processes, policies, and facilities.

The recommendations generally fall within four main areas:

  1. Test and Evaluation and the Requirements Determination Process.
  2. Developmental and Operational Test Process.
  3. Modeling and Simulation in Support of the Test and Evaluation Process.
  4. Test and Evaluation Facility Reengineering.

DSB recommendation 1. A Test and Evaluation Role in Requirements Determination.

The Defense Science Board recommended that testers be part of a team to assist in the development of the Mission Needs Statement (MNS) and the Operational Requirements Document (ORD). The year before, the NRC panel also concluded that there should be a role for operational test personnel in the process of establishing verifiable, quantifiable, and meaningful operational requirements. It appears that, at least recently, every time an outside group looks at the situation, it concludes that there should be test input to the requirements-setting process. This is more relevant now than ever before because the requirements-setting process is changing. Requirements are no longer viewed as a fixed set; rather, cost has become the independent variable, and the articulated requirements are negotiable in order to make the system affordable. In such an environment, the operational consequences of trading off capability for cost savings become very important. In addition, with the institution of time-phased requirements, the role of the tester in time-phased testing of these requirements becomes important.

The DSB also suggested that a Preliminary Test and Evaluation Plan be developed in conjunction with the Mission Need Statement. In addition, the DSB suggested that there be an early operational assessment based on the ORD and an operational scenario before Milestone I (no actual hardware would be involved). My office will support any and all such efforts.

DSB recommendation 2. Developmental and Operational Test Should Consolidate.

The Defense Science Board recommended that Service DT and OT organizations be consolidated to include integrated planning, use of models, simulation, and data reduction. They also recommended that OSD T&E organizations be consolidated, and this has been done, as reported above.

Of the individual Services, the Army has already reorganized to consolidate its developmental and operational testing organizations. The success of this will depend critically on how well the Service provides the new organization with resources.

DSB recommendation 3. Modeling and Simulation Should Support Test and Evaluation.

The Defense Science Board recommended that the acquisition process require and fund an M&S plan at the earliest practical point in a program. They recommended that oversight and direction of M&S Development and Employment for T&E be carried out by an OSD T&E organization.

Since the DSB Task Force reported, a survey of program management offices indicated that fewer than half of the programs had an M&S staff, fewer than half had a "collaborative environment" for M&S, and fewer than two-thirds had an M&S plan or mentioned M&S in the contract.

DOT&E has encouraged using M&S throughout the life cycle of systems. One step in actually doing that would be to require pre-test prediction of results based on models and simulations, with the input to those models in turn based on previous testing.

In a second step to support the Defense Science Board recommendation, I will require that future tests make an effort to identify the required input values used to characterize the systems represented in major force-on-force models, with emphasis on the major J-models (Joint Modeling and Simulation System, Joint Simulation System, and Joint Warfare System). The scale of activity of Live Fire lethality/survivability tests, operational tests, and CINC exercises often does not match the input required for models. Nevertheless, the processes in place do not take enough advantage of the realistic field testing that has occurred.

Operational tests can collect data to provide realistic input to these models. The Joint Warfare System (JWARS) is a case in point. A major component of the improvements JWARS will represent over the legacy models will be in its handling of Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR). Most of the C4ISR systems represented in the model have operational test or field data available on both performance and operational availability, yet these data are not currently being used. On a more promising note, the methodology of the Joint Munitions Effectiveness Manuals (JMEMs), which are used operationally to match the number and types of weapons to targets, has been discussed as a means for forecasting the effectiveness of future systems in the JWARS model. In the reorganization of OSD T&E, JMEMs have become part of my responsibility. I am beginning a special task to encourage this connection between OT, JMEMs, and models.

Test design and planning, as well as logistic and reliability analyses, were highlighted in the DSB report as areas that would benefit significantly if M&S were applied to them. They note, however, that the lack of up-front funding is often the critical problem for M&S. The DSB recommended more funding of M&S in the programs.

An example of the benefits of using modeling in conjunction with field tests is included in the system report on the Longbow Hellfire Missile in the main body of this report.

DSB recommendation 4. Test and Evaluation Facilities Need Reengineering.

The Defense Science Board recommended periodic reengineering efforts at the major ranges: "Facilities within the DoD's Major Range and Test Facilities Base should be required to conduct periodic, systematic reviews to determine where data acquisition, reduction, and analysis procedures could be improved to increase the efficiency of the T&E process."

My office will look into these matters for their own merit, and because Section 913 of the Fiscal Year 2000 Defense Authorization Act requires an examination of consolidating test and evaluation responsibilities by area or function or by designating lead agencies/executive agents.

In summary, the DSB report would require significant modification to the regulations or practices governing acquisition, the organizations responsible for executing tests, and the methods used to test. Regulations or practices will need to change to require: (1) tester input to the requirements formulation process (both MNS and ORD); (2) a preliminary TEMP at MNS; (3) an Operational Assessment at Milestone I; (4) participation of the Service OTAs in all ACTDs (this is currently only an option); and (5) better software testing. Finally, the DSB recommended that new methods be adopted for use of M&S in T&E; and that there be more interaction with system developers and contractors in Combined Acquisition Forces.

DSB work in T&E will not stop with this report. Congress has directed a new DSB study discussed below under Future Studies.

National Research Council (NRC) Report.

In May 1998, the National Research Council published the first results of its four-year study on testing and defense acquisition. From its fundamental observation that "many design flaws in both industrial and defense systems become apparent only in operational use" and its observations on the value of involving operational testing early in an acquisition program, the NRC recommended a new approach:

Congress and the Department of Defense should broaden the objective of operational testing to improve its contribution to the defense process. The primary mandate of the Director, Operational Test and Evaluation should be to integrate operational testing into the overall system development process to provide as much information as possible, as soon as possible, on operational effectiveness and suitability. In this way, improvements to system design and decisions about continuing system development or passing to full-rate production can be made in a timely and cost-effective manner.

Other recommendations by the NRC panel included: (1) establishing a role for operational test personnel in the process of defining verifiable, quantifiable, and meaningful operational requirements; (2) giving DOT&E access to funds to augment operational tests when needed; (3) having OPEVALs address a system's overall performance and its ability to meet mission goals; (4) establishing a centralized testing and operational evaluation data archive; and (5) increasing the use of early, small-scale operational testing.

This year the NRC published a volume of three backup papers. It included a game theoretic proof of the advantage of oversight (supervision) of the test design stage rather than the test reporting process, a separate suggestion for improved analysis of reliability testing results, and an outline of better software testing procedures. The first of these papers is consistent with DOT&E and the Service OTAs becoming involved early in the testing programs of major and non-major systems. In the other two specific areas, I have begun an effort with the NRC to bring together experts to improve the practice of T&E in reliability testing and software testing. OSD Program Analysis & Evaluation and USD(A,T&L) are co-sponsors.

Future Studies.

General Accounting Office (GAO) Studies of Best Practices

The GAO recently completed a study of best practices in acquisition, with which the Department agreed in large measure. They are now undertaking a review of T&E best practices.

DSB Study at Congressional Direction

Section 913 of the Fiscal Year 2000 Defense Authorization Act directs the Secretary of Defense to convene a panel of independent experts, under the auspices of the Defense Science Board, to conduct an analysis of the resources and capabilities of the Department of Defense, including those of the military departments. This study will identify opportunities to achieve efficiencies, reduce duplication, and consolidate responsibilities in order to have a national T&E capability that meets the challenge of Joint Vision 2010 and beyond. A key part of that challenge will be suggesting the investment strategy for the items discussed under testing instrumentation. Planning for the study has begun and its findings are due no later than August 2000.

 

IV. RESOURCES FOR THE OFFICE OF THE DIRECTOR, OPERATIONAL TEST AND EVALUATION

In FY99, my office was funded with $15.311 million for operational testing and $18.934 million for live fire testing. Of the $18.934 million for live fire testing, $5 million was added by Congress for "Testing and Training," and $4 million was added for "Radio Frequency Assessments." These topics of special congressional interest are discussed in detail in the Live Fire section of this report.

In October 1999, DOT&E responded to a DoD Inspector General (IG) audit. Our own study confirmed the IG conclusion that DOT&E "lacked adequate resources to monitor, review, and report all DoD operational testing." However, through prioritization of our efforts, we have emphasized the programs where the most attention was warranted. As discussed below, it must be acknowledged that the testing of non-major systems has become a major problem.

 

V. LIVE FIRE TESTING AND EVALUATION

Fiscal Year 1999 marked the fifth year since the Federal Acquisition Streamlining Act mandated that the Live Fire Test and Evaluation (LFT&E) Program become an integral part of our mission. The integration of the LFT&E Program into our mission has enabled us to take a more balanced look at weapon effectiveness, suitability, and survivability. LFT&E is driven primarily by the physics of the failure mechanisms, while OT&E is driven heavily by the tactics and doctrine practiced. Working together, the two yield far more than either taken separately.

My Live Fire Directorate has been very active in making the best use of modeling and simulation in T&E. The Live Fire Test program, perhaps more than any other testing activity, continues to add discipline to the exercise and evaluation of modeling and simulation in support of acquisition. Live Fire Test policy requires a model prediction to be made prior to every Live Fire Test. This policy has focused attention on test instrumentation, test issues, shot sequencing, as well as model adequacy.
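
The pre-test prediction requirement invites a simple bookkeeping discipline: record the model's prediction before each shot, then score it against the observed outcome. The Python sketch below uses the Brier score (mean squared error of predicted probabilities) as one plausible scoring rule; the report does not prescribe a particular metric, and the shot data are hypothetical.

```python
# Sketch of scoring pre-shot model predictions against Live Fire outcomes.
# Each shot has a model-predicted probability of component kill recorded
# before the test, and an observed outcome (1 = killed, 0 = not killed).
# The Brier score summarizes model adequacy; all data here are hypothetical.

def brier(predictions, outcomes):
    """Mean squared error between predicted probabilities and outcomes."""
    return sum((p - o) ** 2 for p, o in zip(predictions, outcomes)) / len(outcomes)

pre_shot_predictions = [0.9, 0.7, 0.2, 0.6]  # recorded before each shot
observed_outcomes    = [1,   1,   0,   0]    # observed after each shot
print(f"Brier score: {brier(pre_shot_predictions, observed_outcomes):.3f}")
# 0.0 is perfect; 0.25 matches an uninformative constant 0.5 prediction.
```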

Also during FY99, the OSD-chartered Joint Live Fire Program, which addresses the survivability of fielded DoD equipment (in contrast to the LFT&E program, which evaluates systems not yet fielded), initiated a major thrust to evaluate the effects of Man-Portable Air Defense Systems (MANPADS) against both U.S. and threat aircraft. This effort will also have significant impact on the commercial aviation sector. MANPADS attacks have been one of the leading causes of loss of life in commercial aviation worldwide, with over 30 aircraft lost.

Congress provided additional funding to the Live Fire Test program to initiate the assessment of potential vulnerability effects from radio frequency devices of various origins, complexities and configurations on military and commercial-off-the-shelf systems. Numerous proposals were received during the year and several projects will be completed over the coming year to address these emerging threat issues.

The Live Fire Testing and Training Program continued to grow and see major successes. In fact, the Combat Trauma Patient Simulator project was so successful that Congress spun it off and is now supporting it as a separate line item. Congress provided additional funds to expand the testing and training program due to such early successes. In FY00, the testing and training program has again grown with congressional support to its highest level since the program began.

In July 1998, I proposed a forum to discuss and consider clarifying LFT&E policy. Together with the Army, Navy, and Air Force Test and Evaluation executives, we drafted new regulations under existing Live Fire Test legislation (Title 10, Section 2366) and reached consensus in late 1998. This draft has been forwarded to the Defense Acquisition Policy Working Group, and we are awaiting its incorporation into appropriate regulations.

 

VI. NON-MAJOR SYSTEMS HAVE RECEIVED GUIDANCE TO PROMOTE OT&E INVOLVEMENT

Section 139(b)(3), Title 10, U.S. Code states: "The Director shall monitor and review all operational test and evaluation in the Department of Defense." As first provided last year, a section of this report summarizes the most important OT&E activities conducted by the Services on non-major weapon systems as they supported full-rate production acquisition decisions. Unfortunately, in many cases budget pressures on the OTAs have caused them to drop their participation in non-major systems.

The difficulty of encouraging early Service OTA involvement in the test and evaluation of non-oversight systems has come to the attention of USD(A,T&L). He has issued policy guidance to the Service Acquisition Executives and Service Vice Chiefs of Staff that they not consider the acquisition strategy or operational requirements development complete until the OTA has coordinated on it. The goal of this guidance is to promote early involvement, to budget adequate resources for T&E, and to establish the adequacy of planned operational testing against military requirements.

 

VII. INDUSTRIAL COMMITTEE ON OPERATIONAL TEST AND EVALUATION

During FY99, the Industrial Committee on Operational Test and Evaluation was formally chartered under the aegis of the National Defense Industrial Association to foster clear communication between the private defense industrial sector and our office. From the outset, the committee has been effective in resolving issues regarding the lack of access by some companies to vital T&E-related documents such as TEMPs, ORDs, and other supporting documentation, and it has supported changes in policy and practice that have resulted in a clearer understanding of user requirements.

 

VIII. MAJOR REPORTS

In FY99 and to date in 2000, fourteen formal reports on the OT&E and LFT&E of weapons systems have been submitted to Congress. These reports are bound separately and are available on request as an annex to the classified version of this Annual Report.

Several of these were Beyond Low-Rate Initial Production (B-LRIP) OT&E reports, namely the Combat Service Support Control System, Close Combat Tactical Trainer, B-1B Block D Conventional Mission Upgrade Program, Cheyenne Mountain Upgrade, ALR-67(3) Advanced Special Receiver, Secure Mobile Anti-Jam Reliable Tactical-Terminal, Fighter Data Link, Voice Communications Switching System, and the Minuteman III Guidance Replacement Program. Three of the reports were Live Fire Test reports (the B-1B Conventional Mission Upgrade Program, the Army's Hand-Emplaced Wide Area Munition, and the SH-60B and HH-60H Armed Helo). The Joint Standoff Weapon and the Rolling Airframe Missile each required a combined B-LRIP OT&E and Live Fire Test report.

This Annual Report responds to statutory requirements. No waivers were granted to subsection 2399 (e)(1), Title 10, United States Code, pursuant to subsection 2399 (e)(2). Members of my staff and I will be pleased to provide additional information.



Philip E. Coyle
Director

