
FY01 Annual Report

DIRECTOR'S INTRODUCTION

I had the privilege of being nominated in May 2001 by the President to be the Director of Operational Test and Evaluation. I was subsequently confirmed by the Senate and sworn in as the DOT&E in July 2001. During my confirmation hearing before the Senate Armed Services Committee, Senator Levin emphasized to me that testing needed to be independent, fair, and reliable. Senator Warner made clear that testing was important at every level, from weapons to clothing, and that we did not need another M-16-type problem. Thus, I began my tenure with a strong reminder that, for all systems, the quality of testing and information obtained from testing is a major concern to the Congress as well as to the Department of Defense. Events since September 11 have once again confirmed the importance of fielding effective and suitable weapon systems. There simply may not be enough time to "get it right" after a weapon system is provided to our soldiers, sailors, airmen, and marines.

I have committed to provide the Secretary of Defense and other senior decision makers in the Department with objective assessments, on a continuous and timely basis, of our weapon systems undergoing test and evaluation, and to work diligently to implement the recommendations of the December 2000 Defense Science Board (DSB) report on test and evaluation. In that context, I am committed to having the test and evaluation community focus its test programs on military missions, accomplishment of those missions, and total life-cycle suitability. Finally, I am committed to ensuring that reliable, effective, and safe weapons are delivered to our dedicated combat forces through robust testing.

As DOT&E, I will do everything in my power to address these concerns and keep these commitments. The guidance I have provided my staff, and shared with the Service Operational Test Agencies, is to strive to achieve the following objectives:

  1. Rigorous, robust testing that is adequate by any standard, focused on military missions, mission accomplishment, and total life-cycle suitability. One principal responsibility will be to provide decision makers with timely and objective information on a system's demonstrated capability to date, as opposed to judging only the overall effectiveness/suitability of a system in a pass/fail mode relative to some set of requirements at the end of development. In order to accomplish this, we will need quality evaluation plans, complete and clear Test and Evaluation Master Plans (TEMPs), production-representative systems for OT&E, and no waivers or deferrals that compromise the completeness of evaluations.
  2. A test infrastructure that can execute adequate testing in time to meet a program's reasonable schedule. "Testing takes too much time" is often an excuse used to cut back on testing. The T&E infrastructure must be systematically improved to support reasonable schedules.
  3. "Tell-it-like-it-is" reports that are complete, accurate, objective, and timely to support programmatic decisions, accomplished in a mission context as opposed to specification compliance.


QUALITY OF TESTING

The December 2000 Defense Science Board Report noted, "The systems below Acquisition Category (ACAT) I in the priority system are being fielded without adequate testing. Even for the ACAT I programs there is growing evidence that testing is not being done adequately." This is in line with Senator Warner's concern about the quality of testing at all levels. I am, quite frankly, raising the bar on what will be considered adequate test and evaluation while, at the same time, striving to get systems that are effective and suitable into the hands of our forces faster. For the last two years, our Annual Reports have noted problems with the testing of non-major systems (i.e., below ACAT I), which are not under OSD oversight. Reports from the Operational Test Agencies this year indicate little has changed since the DSB assessment of a year ago.

My initial efforts to increase the quality of testing have focused on ensuring that operational tests are complete enough to provide data adequate for the evaluation.

One feature of current practice I seek to change is the Service's ability to waive tests without DOT&E review and approval. The Defense Science Board strongly recommended that Secretary of the Navy Instruction 5000.2B be modified to rule out waivers as a unilateral action by the Service. The current policy allows waivers from criteria for certification of readiness for operational test (such as completion of the system safety program) and waivers for deviation from testing requirements directed by the Test and Evaluation Master Plan.

I have begun a dialogue with the Service to modify this regulation. The modification, I believe, should meet two objectives: first, to treat TEMP requirements as a contract between the Service and OSD that can be changed only by mutual consent, and second, to ensure that the decision maker, at whatever level, has the benefit of knowing that all important information, including waived parameters, was used in resolving critical operational issues.

A second area where effort is needed to improve the quality of testing is software testing. Reviews of Department testing have repeatedly criticized the quality of software testing. This year, in a continuing series of external reviews sponsored by my office and the Under Secretary of Defense for Acquisition, Technology and Logistics (USD(AT&L)), the National Academy of Sciences held a workshop on Software Reliability Testing. This workshop brought in experts from industry, academia, and defense to review how DoD conducts software development and testing. The National Academy concluded that the Department is not using the best test practices available. Practices such as use-based testing are known in industry to identify problems faster and, therefore, to save cost and time. The consensus of the workshop participants was that the Department should be more proactive in using methods that increase the tempo at which we gain information from testing. In cooperation with the USD(AT&L), we have begun to explore designating a number of pilot programs in this area.
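As a concrete illustration of what use-based (operational profile) testing involves, the sketch below allocates test cases to system functions in proportion to their expected field usage, so the functions exercised most in operations are also exercised most in test. The profile, function names, and pass/fail stub are purely illustrative assumptions, not drawn from the workshop or from any DoD program.

    import random

    # Illustrative operational profile: each system function and the share of
    # expected field usage it receives (weights sum to 1.0). These names and
    # numbers are hypothetical, for demonstration only.
    OPERATIONAL_PROFILE = {
        "plan_mission":    0.05,
        "navigate":        0.40,
        "acquire_target":  0.30,
        "transmit_report": 0.20,
        "reconfigure":     0.05,
    }

    def run_operation(name: str) -> bool:
        """Stand-in for invoking the real system function; returns True on success."""
        return True  # replace with a call into the system under test

    def use_based_test(n_cases: int, seed: int = 1) -> dict:
        """Draw test cases in proportion to expected usage and tally failures."""
        rng = random.Random(seed)
        functions = list(OPERATIONAL_PROFILE)
        weights = list(OPERATIONAL_PROFILE.values())
        failures = {f: 0 for f in functions}
        for _ in range(n_cases):
            chosen = rng.choices(functions, weights=weights, k=1)[0]
            if not run_operation(chosen):
                failures[chosen] += 1
        return failures

    if __name__ == "__main__":
        # Failures surface first in the functions users will exercise most often.
        print(use_based_test(1000))

Because test effort mirrors expected usage, the defects most likely to affect operations tend to surface earliest, which is the source of the time and cost savings noted above.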

Although I feel strongly that the quality of testing needs to improve, I also believe that we can get weapons that are effective and suitable into the hands of our forces faster by increasing the tempo of testing. In this area, we have worked with USD(AT&L) to accelerate the testing and fielding of systems that might be especially relevant to the campaign against global terrorism. We are evaluating a number of programs where testing could be accelerated, and where enough is already known to ensure that the system in its present state would increase our capability. None of this is business as usual. This type of accelerated test activity and up-to-the-minute evaluation will require greater flexibility from ranges and test organizations.


INFRASTRUCTURE: FACILITIES, PEOPLE, AND PROCESSES

FACILITIES

In the long run, increasing the tempo of testing will require a shift in our current practices for funding and managing test facilities and ranges. The current financial system has evolved away from the guidance provided when the Major Range and Test Facilities Base was established. At present, defense programs must bear both the cost of their tests and the overhead costs of maintaining the ranges. This has proven to be a disincentive to testing. The cost to program managers has risen sharply over the past decade as they have taken on the overhead costs of the test ranges; as a result, program managers seek to minimize the amount (and therefore the cost) of testing. As they succeed, the fixed overhead is spread over fewer tests, forcing the price of each test even higher. Operational testing sees the results of inadequate development testing at the end of the development phase. The one Service that has presented information on this to us indicates that more systems are failing to meet suitability requirements now than under the earlier approach to test funding. At present the reimbursable rate is about 75 percent for the Army, 63 percent for the Air Force, and 67 percent for the Navy. Unfortunately, because there is no common financial management system among the Services, these data carry enough uncertainty to make direct comparison difficult.
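To make the cost spiral concrete, here is a minimal sketch of the arithmetic, using entirely hypothetical figures rather than actual range rates: when range overhead is largely fixed and is recovered through reimbursable charges, buying fewer test events raises the charge for each remaining event.

    def price_per_event(fixed_overhead: float, direct_cost: float, n_events: int) -> float:
        """Charge per test event when fixed range overhead is recovered across the
        events actually purchased (an illustrative model, not an official rate formula)."""
        return direct_cost + fixed_overhead / n_events

    # Hypothetical numbers: $20M of fixed range overhead, $0.5M direct cost per event.
    for n in (100, 50, 25):
        cost = price_per_event(20e6, 0.5e6, n)
        print(f"{n:3d} events -> ${cost / 1e6:.2f}M per event")

In this example, halving the number of events raises the unit price by roughly 30 percent, which is the dynamic described above.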

After reviewing the Department's T&E capability, the Defense Science Board task force recommended in December 2001 that DoD create a DoD T&E Resource Enterprise under the Office of the Director of Operational Test and Evaluation to fund and manage the DoD T&E organizations, work force, and infrastructure.

On June 28, 2001, the Secretary of Defense noted in testimony on the amended 2002 Defense Department budget that "We need to get on a path to correct the most serious deficiencies; we need to stabilize the force and begin modernization; we need to restore DoD infrastructure..." Our recapitalization period for the technical facilities of the test infrastructure is on the order of 70 years, meaning that at current investment rates it would take roughly 70 years to replace them. Commercial best practice is about 17 years. It would be a major task to create and defend the budgets needed to fix the infrastructure.

The Senate Armed Services Committee report states, "The committee directs the Department to develop a budget plan and schedule for the implementation of the Defense Science Board's recommendations, to be submitted along with the fiscal year 2003 budget request." Creating a DoD T&E Resource Enterprise, as recommended by the DSB, would be a major, controversial step. After reviewing plans to implement this proposal, the Department chose not to go forward with this recommendation in the FY03 budget.

PEOPLE

Infrastructure is not limited to facilities, but also includes people and processes. The DSB Task Force "learned that the issue of human resources - how to attract and retain personnel with the motivation and skill to serve and lead in civilian and military capacities - is one of the most significant concerns of the T&E community."

The demographics of T&E show that a large fraction of the community will soon be eligible to retire. Further, the downsizing over the last ten years has all but precluded the recruiting of new talent. As a result, the T&E community's long-standing relationships with universities, and its hiring of graduates with skills in new research areas, have suffered. We have begun to address this issue with a new effort called "Test and Evaluation/Science and Technology." This program element addresses two concerns: first, bringing leading-edge technologies to the T&E business; and second, establishing links between the test ranges and the universities where much of that research is done.

A second aspect of recruiting new talent is the ability to offer wages competitive with industry in these high-technology areas. Pay-banding has proven to be an effective way to attract and, at least for a time, retain new graduate talent. Unfortunately, only a few Service test ranges have implemented this option. I recommend modifying the pay grade system to allow the pay-banding option in hiring at all the ranges.

PROCESSES

A common financial management system is an essential first step to improved management of our nation's test ranges. The present set of individual range accounting systems, many of which do not meet current DoD accounting system standards, simply does not provide the type of information needed for efficient management.


REPORTING TEST RESULTS

Congressional interest this year resulted in legislation on the communication of safety concerns from operational test and evaluation officials to program managers. The National Defense Authorization Act for Fiscal Year 2002, Section 263, states in part, "The Director shall ensure that safety concerns developed during the operational test and evaluation of a weapon system under a major defense acquisition program are communicated in a timely manner to the program manager for that program and for consideration in the acquisition decision making process." With respect to this congressional concern, I have issued the following policy:

  • The responsible test organization shall, to the maximum extent practicable, release valid test data and factual information in as near real-time as possible to all parties with a need to know. Data may be preliminary and should be recognized as such.
  • To protect the integrity of the OTA evaluation process, release of evaluation results may be withheld until the final report, according to each OTA's established policies.
  • This policy is effective immediately and will be incorporated in the next revision of DoD Regulation 5000.2-R.


CONCLUSION

We face many challenges in our commitment to improve the quality of testing, the infrastructure to support testing, and the reporting of test results. Ensuring that testing is rigorous and robust by any standard, providing an infrastructure that executes adequate testing in time to meet a program's schedule, and submitting "tell-it-like-it-is" reports are the necessary steps to bringing about those improvements. I am honored to have the opportunity to work toward these ends.


MAJOR REPORTS

From the beginning of FY01 through February 14, 2002, 12 formal reports on the OT&E and LFT&E of weapon systems have been prepared for submittal to Congress. These reports are bound separately and are available on request. The reports are:

  • Predator UAV OT&E
  • Minuteman III PRP
  • M2A3 Bradley Fighting Vehicle System OT&E/LFT&E
  • 2000 Pound JDAM OT&E
  • B-2 LFT&E
  • USS Osprey (MHC 51) Class Coastal Mine Hunter OT&E
  • MH-47E and MH-60K SOA LFT&E
  • AN/ALQ-135 Internal Countermeasure Set for the F-15E Strike Eagle OT&E
  • V-22 Osprey OT&E and LFT&E
  • XM1001 40mm Canister Cartridge LFT&E
  • Joint Primary Aircraft Training System OT&E
  • Cooperative Engagement Capability OT&E

This annual report responds to statutory requirements. No waivers of subsection 2399(e)(1), Title 10, United States Code, were granted pursuant to subsection 2399(e)(2). Members of my staff and I will be happy to provide additional information as appropriate.


Thomas P. Christie
Director


