[House Hearing, 111 Congress]
[From the U.S. Government Printing Office]
[H.A.S.C. No. 111-36]
MEASURING VALUE AND EFFICIENCY:
HOW TO ASSESS THE PERFORMANCE
OF THE DEFENSE ACQUISITION SYSTEM
__________
HEARING
BEFORE THE
PANEL ON DEFENSE ACQUISITION REFORM
OF THE
COMMITTEE ON ARMED SERVICES
HOUSE OF REPRESENTATIVES
ONE HUNDRED ELEVENTH CONGRESS
FIRST SESSION
__________
HEARING HELD
APRIL 1, 2009
[GRAPHIC] [TIFF OMITTED] TONGRESS.#13
U.S. GOVERNMENT PRINTING OFFICE
51-761 WASHINGTON : 2010
-----------------------------------------------------------------------
For sale by the Superintendent of Documents, U.S. Government Printing
Office, http://bookstore.gpo.gov. For more information, contact the
GPO Customer Contact Center, U.S. Government Printing Office.
Phone 202-512-1800, or 866-512-1800 (toll-free). E-mail, gpo@custhelp.com.
PANEL ON DEFENSE ACQUISITION REFORM
ROBERT ANDREWS, New Jersey, Chairman
JIM COOPER, Tennessee K. MICHAEL CONAWAY, Texas
BRAD ELLSWORTH, Indiana DUNCAN HUNTER, California
JOE SESTAK, Pennsylvania MIKE COFFMAN, Colorado
Andrew Hunter, Professional Staff Member
Jenness Simler, Professional Staff Member
Alicia Haley, Staff Assistant
C O N T E N T S
----------
CHRONOLOGICAL LIST OF HEARINGS
2009
Page
Hearing:
Wednesday, April 1, 2009, Measuring Value and Efficiency: How to
Assess the Performance of the Defense Acquisition System....... 1
Appendix:
Wednesday, April 1, 2009......................................... 29
----------
WEDNESDAY, APRIL 1, 2009
MEASURING VALUE AND EFFICIENCY: HOW TO ASSESS THE PERFORMANCE OF THE
DEFENSE ACQUISITION SYSTEM
STATEMENTS PRESENTED BY MEMBERS OF CONGRESS
Andrews, Hon. Robert, a Representative from New Jersey, Chairman,
Panel on Defense Acquisition Reform............................ 1
Conaway, Hon. K. Michael, a Representative from Texas, Ranking
Member, Panel on Defense Acquisition Reform.................... 3
WITNESSES
Ahern, David G., Director of Portfolio Systems Acquisition,
Office of the Under Secretary of Defense for Acquisition,
Technology and Logistics....................................... 6
Sullivan, Michael J., Director for Acquisition and Sourcing
Management, U.S. Government Accountability Office.............. 8
APPENDIX
Prepared Statements:
Ahern, David G............................................... 38
Andrews, Hon. Robert......................................... 33
Conaway, Hon. K. Michael..................................... 34
Sullivan, Michael J.......................................... 51
Documents Submitted for the Record:
[There were no Documents submitted.]
Witness Responses to Questions Asked During the Hearing:
Mr. Andrews.................................................. 77
Mr. Conaway.................................................. 78
Mr. Sestak................................................... 80
Questions Submitted by Members Post Hearing:
[There were no Questions submitted post hearing.]
MEASURING VALUE AND EFFICIENCY: HOW TO ASSESS THE PERFORMANCE OF THE
DEFENSE ACQUISITION SYSTEM
----------
House of Representatives,
Committee on Armed Services,
Panel on Defense Acquisition Reform,
Washington, DC, Wednesday, April 1, 2009.
The panel met, pursuant to call, at 7:33 a.m., in room
2212, Rayburn House Office Building, Hon. Robert Andrews
(chairman of the panel) presiding.
OPENING STATEMENT OF HON. ROBERT ANDREWS, A REPRESENTATIVE FROM
NEW JERSEY, CHAIRMAN, PANEL ON DEFENSE ACQUISITION REFORM
Mr. Andrews. Ladies and gentlemen, good morning. The panel
will come to order.
I am informed that we expected our ranking member, Mr.
Conaway, will be present shortly. But because one of our
minority colleagues is present, we are going to begin.
First of all, I appreciate the indulgence of the witnesses
and my colleagues and our staff in being here at such an early
hour. I hope that we did not inconvenience people terribly too
much.
The reasoning behind this is that this is very substantive
and important material. And we want the members to be able to
have an uninterrupted block of time to really hear what the
witnesses say, to engage in what I hope would be constructive
dialogue with the witnesses, and not be caught up in our normal
time pattern around here, which is the bell ringing to vote and
conflicting with other hearings and meetings.
We really want to give our utmost and most serious
attention to the material, so that is the reason for this early
beginning. And I appreciate the indulgence of the members of
the panel.
This morning we are setting out to try to answer the
question, what is the fair measure, what is the fair way of
measuring the difference between--the difference, if any--
between the cost paid by the taxpayers to acquire goods and
services in the Department of Defense (DOD) and the value we
are receiving? Is there a difference between those two
concepts? I think there is. And if there is, what is a fair way
of measuring that difference?
We have two outstanding witnesses this morning who can
speak with great authority to that question. One, Mr. Sullivan,
in his work at the Government Accountability Office (GAO), has
frankly already given us a compelling measure of the answer to
that question with respect to major weapons systems. He is
going to talk this morning about the most recent work the GAO
has done.
And here is essentially what it says. In 2003, we had 77
major weapons systems that were subject to this kind of
evaluation. We now have 96.
In 1977, the average cost differential--the increase in
programs over their original base line--that is a very
important concept, the original base line versus the adjusted
one. But their cost increase over their original base line in
2003 was 19 percent in the aggregate. That number has gone the
wrong way since 2003. It has now gone up to 25 percent in the
most recent data that the GAO has presented.
What is interesting about that 25 percent, it happens to be
one of the triggers in the Nunn-McCurdy legislation that was
passed quite a few years ago now, it is one of the triggers
that triggers a very intense level of scrutiny of a major
weapons system. So it is kind of discouraging to think that,
looked at in the aggregate across these 96 weapons systems, in
the aggregate, they now all trigger this kind of more intense
assessment.
Now, these data, as all good work does, really beg a
different set of questions. Why?
And in looking at the testimony this morning, and from
hearing witnesses, I am sure we are going to hear that there is
a lot more to this than meets the eye.
The superficial response to this would be to say, ``Well,
geez. The people who are building these systems and managing
them must be doing a really terrible job.'' That is not
necessarily so.
And I think what we are going to hear this morning is, if
you go deeper into the process, you find two other questions
that have to be looked at.
The first is how good or bad of a job we are doing at
conceiving these systems in the first place. When there is a
need identified, and there is a weapon system identified to
fill that need, are we following the right process to determine
what should fill the need?
What in the jargon is called an AOA, which is an
independent Analysis of Alternatives, how well or poorly are we
doing that? Because if we do a good job with the analysis of
alternatives, we presumably go down the right path to fill the
need and provide the capabilities that the service members
need.
So, one of the questions I think lies below the disturbing
data with which we start this morning is: How effective is that
AOA system?
And then the second goes to the question of how accurate
the original base line is, or how inaccurate the original base
line is. It is certainly not fair to blame those who are
implementing a weapons systems, if the standards against which
they are being measured were unrealistic and flawed in the
first place. That may or may not be the case, but it is
something else, again, I think that we are going to hear about.
There is significant evidence to show that the huge
adjustments from the original base line to the modified ones
may not be a measure of a lack of aptitude by those
implementing the systems. It may be a measurement of lack of
accuracy by those establishing the original base lines.
So, the standard against which we are measured is a very
important question. And frankly, it appears like we do not have
the tools to answer that question particularly well.
The final point that I want to make is that we also want to
go beyond this morning's discussion. This morning's discussion,
by necessity, focuses on major weapons systems, major weapons
acquisition.
But as we heard last week in the briefing, a significant
percentage--at least half--of the procurement done under the
Department of Defense is not major hardware systems, it is
services. And we want to be sure that we are in a position to
take a comprehensive look at those issues as they come along,
as well.
I am glad that my friend and copilot has arrived. And if he
has had a chance to catch his breath, I would be happy to yield
to him and ask him for any introductory comments.
Good morning, Mike.
[The prepared statement of Mr. Andrews can be found in the
Appendix on page 33.]
STATEMENT OF HON. K. MICHAEL CONAWAY, A REPRESENTATIVE FROM
TEXAS, RANKING MEMBER, PANEL ON DEFENSE ACQUISITION REFORM
Mr. Conaway. All right. Thank you, sir. My apologies. The
doors down in front were not open, and the police had a long
rollcall as their excuse.
Mr. Andrews. So much for bipartisanship. Now we lock the
doors on these guys. [Laughter.]
But we locked a democrat out, too? Okay. All right.
[Laughter.]
Mr. Conaway. Yes, he went the other way.
Good morning, Mr. Chairman, ladies and gentlemen.
Mr. Andrews. Good morning, Mike.
Mr. Conaway. I think it is appropriate for the panel's
first hearing that we have a senior member of the Defense
Acquisition System and GAO's senior acquisition management
professionals sitting side by side. It is not that often that
members of this committee get a chance to talk to the
Department of Defense and GAO at the same time.
Thank you, gentlemen, for making this possible and agreeing
at such an early hour.
The first question this panel identified as part of the
work plan was whether there is a method to reasonably measure
the ability of the defense acquisition system to deliver the
goods and services needed by the warfighter, and to do so in a
timely fashion, and to do so at a fair price to the taxpayer.
Today's hearing will likely not answer the larger
philosophical question about how one should measure value in
defense acquisition, but is an important first step for us to
understand how DOD and GAO currently assess performance in one
segment of defense acquisition, the major weapons systems
programs, that were the focus of GAO's assessment released this
week.
These programs receive a great deal of scrutiny by Congress
and by the media for good reason. GAO's report reveals that
nearly 70 percent of DOD's 96 largest weapon programs were over
budget $296 billion, or 42 percent. This is simply
unacceptable.
Everyone understands why we cannot continue to tolerate
these cost increases. There is little more to be said on that
subject.
But what we do not hear as much about is that the GAO had
encouraging words to say about the steps the Pentagon has taken
to improve acquisition outcomes, including early stage systems
engineering, prototyping, measurable yearly plans, increasing
accountability and minimizing requirements creep.
The report states, these changes are consistent with a
knowledge-based approach to weapons development that we have
recommended in our work. If implemented, these changes can help
programs to reduce risk with knowledge, thereby increasing the
chances of developing weapons systems within cost schedule
targets, while meeting user needs.
These are encouraging signs. But to improve outcomes on the
whole, DOD must ensure that these policy changes are
consistently implemented and reflected in decisions on
individual programs.
I hope we hear more today about these positive improvements
that DOD is making and what more needs to be done. Of course,
we are likely to learn that much of what DOD does to measure
performance is already statutorily required.
I also hope our witnesses feel free to share their views on
laws and regulations that are not assisting in their efforts to
obtain the best value and capability for our warfighters. There
is a balance to be struck between setting high expectations and
over-regulating the system.
With that I conclude, and again thank my fellow members.
And Mr. Chairman, I look forward to the witnesses'
testimony.
[The prepared statement of Mr. Conaway can be found in the
Appendix on page 34.]
Mr. Andrews. Thank you, Michael, very much.
And just echoing my friend's opening statement, when he
talks about the $296 billion in overruns, it is interesting
from Mr. Sullivan's testimony we are going to hear that the
weapons systems I made reference to have a total projected cost
of $1.6 trillion. And half of that money is yet to be expended.
So, to put that in some context, if the 25 percent overrun
that the GAO now reports had not been the case, the 25 percent
of that $800 billion is $200 billion. You know, the deficit
this year is about $1.2 trillion. It is a sixth of the deficit
that we are talking about, just from these systems expressed in
one-year terms. So, it is a lot of money and is of great
significance.
Without objection, any opening statements from other panel
members will be included in the record.
I want to first go through the biographies of the
witnesses.
And we are going to ask the witnesses--without objection,
we have entered your written testimony into the record of the
panel--to summarize your testimony in about five minutes. We
are not going to rigidly adhere to that rule this morning, but
we would like you to try to summarize your testimony in about
five minutes, and then we will proceed to questions from the
members.
David G. Ahern is the Director of Portfolio Systems
Acquisition. He is responsible for providing portfolio
management, technical and programmatic evaluation and
functional oversight. His office sustains Department of Defense
strategic and tactical programs in support of the Under
Secretary of Defense for Acquisition, Technology and Logistics,
and the Deputy Under Secretary of Defense for Acquisition and
Technology.
Mr. Ahern was previously professor of program management
and Director of the Center for Program Management at the
Defense Acquisition University (DAU) at Fort Belvoir, Virginia.
While at DAU, Mr. Ahern also served as an executive course
learning team mentor and instructor at the Defense System
Management College, School of Program Management.
Mr. Ahern has also held business development, program
management and business unit positions in the development of
tactical information systems with General Dynamics Information
Systems Company and the Northrop Grumman Electronic Systems
sector.
A native of Connecticut--one of the Final Four
participants--Mr. Ahern was a career naval officer and is a
graduate of the Naval Academy in Annapolis. He is also a
graduate of the Naval Postgraduate School and Defense Systems
Management College. Mr. Ahern's sea duty was as a naval aviator
in the RA-5C Vigilante during multiple deployments in the
Pacific and Atlantic, and as an Executive and Commanding
Officer of Tactical Electronic Warfare Squadron 33.
Ashore, he was head, Tactical Command and Control Branch on
the staff of the Chief of Naval Operations, project officer of
the Navy Space Project, Class 2 Program Manager at the Joint
Tactical Information Distribution System (JTIDS) Program
Office, Program Manager, Navy Tactical Data Link Systems, and
Deputy of the Program Executive, Office Space, Communications
and Sensors.
Mr. Ahern, thank you, and it is great to have you with us
this morning.
Mike Sullivan--no stranger to this committee--serves as
Director, Acquisition and Sourcing Management at the U.S.
Government Accountability Office. This group has responsibility
for examining the effectiveness of agency acquisition and
procurement practices and meeting their mission performance
objectives and requirements.
In addition to directing reviews of major weapons system
acquisitions, Mr. Sullivan has developed and directs a body of
work examining how the DOD can apply best commercial practices
to the nation's largest and most technically advanced weapons
systems.
This work has spanned a broad range of issues critical to
the success in the delivery of systems, including quality
assurance, transition to production, technology inclusion,
requirement setting, design and manufacturing, reducing total
ownership cost, software management and affordability. His team
also provides the Congress with early warning on technical and
management challenges facing these investments.
Mr. Sullivan has been with the GAO for 23 years. He
received a bachelor's degree in political science from Indiana
University and a master's degree in public administration from
the School of Public and Environmental Affairs in Indiana
University. Mr. Sullivan is married and has two children.
Welcome, gentlemen. We are really happy to have you with
us.
And Mr. Ahern, we will start with your testimony.
STATEMENT OF DAVID G. AHERN, DIRECTOR OF PORTFOLIO SYSTEMS
ACQUISITION, OFFICE OF THE UNDER SECRETARY OF DEFENSE FOR
ACQUISITION, TECHNOLOGY AND LOGISTICS
Mr. Ahern. Thank you, and good morning, Chairman Andrews,
Ranking Member Conaway, distinguished members of the panel.
Thank you for the opportunity to appear before you today to
discuss how the department values the acquisition programs and
assesses the effectiveness of ongoing developments in
procurements. I will be brief in order to move quickly to the
panel's questions.
In December of 2008, the department issued a new version of
the DOD Instruction 5000.2, Operation of the Defense
Acquisition System. This instruction established policies and
procedures for all of the department's acquisition programs.
It provides for a structured, disciplined process and
incorporates many initiatives aimed at improving not only the
defense acquisition system as a whole, but also execution of
individual programs. I would like to take a few minutes to
highlight a few of the initiatives.
First, the department has established a mandatory--a
mandatory--materiel development decision (MDD) review that
represents the formal entry point into the acquisition system.
Every program will go through that MDD. At the Materiel
Development Decision, the preliminary concept of operations, a
description of the needed capability, the operational risk and
the basis for determining that a non-materiel approach will not
sufficiently mitigate the capability gap are thoroughly
discussed.
Also discussed is study guidance for the analysis of
alternatives, which, when completed, will have examined the
full spectrum of alternatives, starting with the current
capability and moving to an entirely new materiel solution,
with a goal of balancing capability needs with what the
department can effectively acquire and afford to achieve the
best value proposition for our nation.
A second major change to the DOD 5000 is a revamp
technology development phase. In the technology development
phase the department seeks to reduce technology risk, determine
the mature technologies to be integrated into a full system and
demonstrate critical technology elements on prototypes.
Competitive prototypes, whether at the system or component
level, reduce technical risk, validate designs and should
improve cost estimates. They will also enable the evaluation of
manufacturing processes and, of course, refine requirements--
again, with a goal of ensuring the acquisition enterprise
pursues the best value solution to meet warfighter needs.
Taken together, the Materiel Development Decision, a
materiel solution analysis and a technology development phase
with competitive prototyping help to define the best value
acquisition program to meet the warfighter needs.
We then execute the selected alternative in the Engineering
and Manufacturing Development (EMD) phase, using additional
tools to keep the program on track.
The Acquisition Program Baseline is the key document for
program management. It reflects the approved program being
executed. It describes the cost estimate, the schedule,
performance, supportability and other relevant factors for the
program.
That Acquisition Program Baseline is the way that we track
the progress of the program through the development in EMD and
on into procurement. An acquisition strategy describes how the
program manager plans to employ contract incentives to achieve
required cost, schedule and performance outcomes.
Technical reviews are another tool the department uses to
assess program status and for decision-making purposes. There
are Defense Acquisition Board reviews where the members advise
the Under Secretary of Defense on critical acquisition
decisions.
Further, there are configuration steering boards held by
the service acquisition executives that meet at least annually
to review all requirements, changes and any significant
technical and configuration changes in their programs that have
the potential to result in cost and schedule impacts. Such
changes will generally be rejected, deferring them to future
blocks or increments.
Program support reviews are means to inform the Milestone
Decision Authority and program office of the status of
technical planning and management processes by identifying
cost, schedule and performance risk and recommendations to
mitigate those risks.
Defense acquisition executive summaries review programs
monthly and provide an early warning report describing the
actual program problems, warnings of potential problems and
mitigation actions.
In addition to the program level tools described above, the
department also employs mechanisms to monitor contract-specific
performance. Earned value management is mandatory on cost and
fixed-price incentive contracts above a certain low threshold
value. It is a well-known tool used by both government and
industry program managers to measure contract performance
against a contract baseline, and it provides an early warning
for baseline deviations and a means to forecast final cost and
schedule on that contract.
I have only touched on a few elements in the new DOD 5000,
aimed at ensuring programs are started with a solid foundation,
are focused on disciplined execution and deliver capability to
the warfighter within cost and schedule parameters.
It will take time for us to fully realize the benefit of
these policy initiatives, and we will continue to look for
opportunities to further improve the defense acquisition
system.
I look forward to the opportunity to work with the members
of this panel on this critical task, and I am grateful to the
members of this committee for your support for the Defense
Department. Thank you.
[The prepared statement of Mr. Ahern can be found in the
Appendix on page 38.]
Mr. Andrews. Mr. Ahern, thank you for your service and your
excellent work throughout your career and your contribution
this morning. Thank you very much.
Mr. Sullivan, welcome. Welcome back.
STATEMENT OF MICHAEL J. SULLIVAN, DIRECTOR FOR ACQUISITION AND
SOURCING MANAGEMENT, U.S. GOVERNMENT ACCOUNTABILITY OFFICE
Mr. Sullivan. Thank you, Mr. Chairman, Ranking Member
Conaway, other members of the committee. I am pleased to be
here this morning to discuss how best to measure the value DOD
is providing to the warfighter.
Earlier this week, we reported that a cumulative cost
growth in DOD's portfolio of 96 major defense acquisition
programs was $296 billion, and the average delay in delivering
promised capabilities to the warfighter was 22 months.
These outcomes mean that other critical national priorities
go unfunded, and warfighters go without the equipment they need
to counter ever-changing threats. This should be unacceptable.
A single metric or set of metrics is not enough to monitor
acquisitions and gain efficiencies. However, a cross-cutting
set of metrics that can measure knowledge, processes and
outcomes can be employed to improve acquisition outcomes.
We think about metrics and their value in the following
context.
First, we use knowledge metrics to determine how well
acquisition programs are managing and retiring predictable
technology, design and manufacturing risks by gaining knowledge
and retiring risk. These metrics are valuable, because they can
predict problems, and they can identify the causes of those
problems, so you can attack those causes.
Second, we use cost, schedule and capability metrics that
measure a program's health. These metrics have intrinsic value
as simple measurements, but they do little in the way of
diagnosing cause and effect. This is a way for managers and
decision-makers to keep an eye on the program.
Third, there are certain indicators that we look for that
are perhaps more important than the metrics, because they
determine the realism of the acquisition plans from the outset,
as the chairman was referring to in his opening statement.
These are a set of prerequisites for any program. And
without them, we question the value of any metric as you move
forward.
We know that the knowledge and program health metrics we
use to measure a program's progress and outcomes are valuable
when used in realistic, schedule-driven product development
environments. They are important indicators to decision-makers.
They work when they are measuring realistic plans and goals
that are supported by doable requirements, appropriate cost and
schedule estimates, and stable funding.
Our knowledge metrics identify potential problems that
could lead to cost and schedule shortfalls, and their likely
causes. They identify technology readiness levels very early,
measure design stability by about midway through a development
program and track whether critical manufacturing processes are
in control at the start of production.
They have predictive value. Generally, programs that do not
measure these risks at the right junctures will encounter a
cascade of problems beginning with design changes and
continuing with parts shortages, changes to the manufacturing
processes, labor inefficiencies on the manufacturing floor and
quality problems that will cost money. All of these things
delay programs and add to their costs.
Outcome metrics provide useful indicators about the health
of acquisition programs, and are valuable tools to improve
oversight. Last year, the Office of Management and Budget (OMB)
tasked DOD to work with us to develop a comprehensive set of
outcome metrics, to track program costs and schedule
performance and trends. We agreed to track trends and changes
across eight different cost and schedule data points, which are
in my written statement--I will not go through them here--from
each of the program's, from their original baseline, from a
five-year-out period and from a year ago. We do that on every
program.
These metrics give decision-makers, such as you, some
visibility into the direction an acquisition may be heading in
terms of cost and schedule.
We scale these outcome metrics up from the individual
programs to a portfolio level, to provide senior department
leaders and the Congress with a snapshot of the cumulative
impact of poor program performance on the relative health of
the overall portfolio and which way it trends.
For example, we know that the cost of the portfolio has
doubled since 2000. There are 19 more major acquisitions in the
portfolio.
Development cost, as the chairman referred to, has grown by
42 percent. And cost growth has forced the department to reduce
quantities on many programs. Programs are getting less for
their money, DOD's overall buying power is reduced and less
funding is available for other priorities.
Metrics by themselves cannot be valuable unless the
department does a better job ensuring that acquisitions start
with realistic baseline estimates for cost and schedule. I
think Mr. Ahern went through a lot of the initiatives in the
new policies that we think are encouraging in this regard.
We believe there is a set of prerequisites that must be a
part of any acquisition strategy before any measurement of an
acquisition's health can be valuable. Otherwise, metrics
measured unrealistic estimates will do no good.
Quickly, these prerequisites include: number one, setting
priorities by ensuring joint acquisitions more often and
validating only candidates that are truly needed and feasible;
number two, making a knowledge-based, evolutionary business
case for the product; number three, separating technology
development activities from product development activities,
which we think is really key, because if you get immature
technologies onto these product development programs, they
cause a lot of problems; number four, limiting the time and the
requirements for product development to a manageable level;
number five, employing systems engineering discipline early to
develop realistic cost and schedule estimates before product
development starts; and committing to fully funding development
programs once they are approved.
Mr. Chairman, I will stop there and conclude my statement.
I will be happy to answer any questions the committee may have.
[The prepared statement of Mr. Sullivan can be found in the
Appendix on page 51.]
Mr. Andrews. Well, thank you, gentlemen, both, for very
edifying and useful testimony. We appreciate the effort. And as
I say, your full statements have been entered into the record.
Mr. Sullivan, the title of your testimony is ``Measuring
the Value of DOD's Weapon Programs Requires Starting with
Realistic Baselines.'' And I think that that is a point that is
made very forcefully in your testimony this morning and beyond
that. And you highlight the importance of the realism of
acquisition plans.
I think it is important to note that, not only is that an
important measurement tool, but it has everything to do with
whether the Congress can make decisions based upon realistic
assumptions. You know, if a system is sold to us on the basis
that it is going to cost $1, and a realistic projection would
be it is going to cost $1.25 or $1.40, it has a very different
set of dynamics that would then take place in our decision-
making.
So, this is really a separation of powers issue, in a
sense, that for us to make an intelligent, clear-eyed decision
about what to do, we need better data on which to make that
decision.
What recommendations would you make? And I know you do this
in your written testimony, but I would like you to elaborate.
What recommendations would you make to improve the accuracy and
transparency of the planning process, the standard-setting
process that goes into these decisions?
And secondly, to what extent do you think the new 5000.2
guidance moves us in that direction?
Mr. Sullivan. Okay. First of all, I think the 5000--the new
5000 guidance--does a lot, I think, to move us in that
direction.
Where we would, as auditors, where we would--and people
that are interested in oversight--we have problems with what is
required and what is suggested. And, you know, often times in
those policies, there is a lot of wiggle room and a lot of
encouragement to do things, but not a lot of requirements to do
things, so we always take issue with that.
Having said that, the principles in that new policy--and
Mr. Ahern went through some of that----
Mr. Andrews. Right.
Mr. Sullivan [continuing]. Address a lot of the things that
we think have to be in place to improve these weapon system----
Mr. Andrews. Tell us what some of them are.
Mr. Sullivan. For example, at the beginning, the initial
decision--it escapes me for the moment----
Mr. Ahern. MDD.
Mr. Sullivan [continuing]. The MDD decision is much more of
a joint decision today. I think they have tried to bring in--
you know, there are three big processes. There is a
requirement-setting process, the funding process and then the
acquisition execution process. And they have trouble speaking
to each other a lot of times.
I think there has been a real push to try to get them
together, so that there is agreement, a joint agreement on how
to move forward with a program. That is one way.
The Analysis of Alternatives that you spoke of, I think the
policy is trying to tweak the way that they analyze
alternatives and try to--and is trying to bring a little more
jointness into that, for example, so that you do not have--in
the past, an Analysis of Alternatives was done by perhaps one
of the services.
And if it was the Army, they were looking at something that
had wheels. And if it was the Navy, it was something that
floated. And if it was the Air Force, it flew. And so, there
were probably a lot of options that did not get full
consideration in that.
As you move to more jointness, I think you get a better
idea----
Mr. Andrews. Do you think it is also characteristic in that
AOA that there was not adequate consideration, simply revamping
an existing system as opposed to starting all over again?
Mr. Sullivan. Probably that was the case. And so now, the
guidance does state that the number for--you know, when you
have a new threat or something that you have to counter, the
first thing you do is look at doctrine. You look at training,
you look at a different way, or modifying existing systems.
You know, that is a tough one to answer. But I think a lot
of programs do get started that probably should not. You know,
they probably could have found an alternative way to do things.
Mr. Andrews. Are you satisfied that there is enough
guidance on looking to the commercial world as a place to look
for solutions to these needs that are identified, in the AOA
process?
Mr. Sullivan. I am not satisfied that they do enough of
that.
You know, usually when there is a--when they are thinking--
well, I will give you an example of kind of commercial, off-
the-shelf programs that the department has tried to start in
the past. I think it is fair to say that programs like
Warfighter Information Network-Tactical (WIN-T), which is a
communications program, was designed to kind of take
commercial, off-the-shelf items and modify them a bit, and make
them available to the warfighter. And that did not work very
well.
There has been a lot of talk about the Presidential
Helicopter and how that has gotten way out of control, because
they were thinking that would be a commercial, off-the-shelf
item. And it came in, often times----
Mr. Andrews. Who has been talking about that? I do not
know.
Mr. Conaway. First I have heard of it.
Mr. Sullivan. The requirement-setting process has an impact
on all that stuff.
Mr. Andrews. Right.
Mr. Sullivan. Once you have a feasible idea to do something
commercially, and the users kind of start looking at it, all of
a sudden requirements start getting piled on that, and it
becomes something much different.
Mr. Andrews. Well, thank you, gentlemen.
I am going to turn to Mr. Conaway for his questions.
Mr. Conaway. Well, thanks, gentlemen. I appreciate you both
being here.
You know, it is pretty presumptuous of us to think that we
are going to ever know as much about acquisitions as the panel,
or probably a bunch of you all sitting in the room.
I guess our role, though, is to try to elicit from you the
solutions, because I have got to believe that with the vast
background that both of you have, and everybody else across the
system has, nobody wants us in the position that we are in
today. And so, having us ask the right questions, I think is
the best job that we can do in this circumstance.
On the mandatory development decision, I guess, MDD, is
there enough rigor there to make sure that the folks making the
estimates are not simply--and this is a bit crass--but simply
low-balling the estimates in order to get the program started?
Because once something is started and the initial inertia is
overcome, it moves, whatever happens.
And so, is there enough, you know, auditing or somebody
checking the guys doing these assessments in that initial phase
to say, you know, these estimates are not realistic?
Mr. Ahern. I think you are talking to me, sir, congressman.
The MDD process, we really do not get too hard into money.
But as was described in my--into the funding that is going to
be required--that really is a transition period between the
Joint Capabilities Integration and Development System (JCIDS)
process, as Mr. Sullivan mentioned, and ``little a.''
Let me elaborate for a minute what I expect in the MDD. And
I am absolutely a proponent of it. I think it is really
critical.
As Mr. Sullivan said, it is the place where we get both the
resource sponsors, the senior people from the Joint Chief of
Staff (JCS), as well as ourselves, in a room and talk through
what it is we are trying to do. And that is predicated on a
good job over in the JCS arena of going through the Concept of
Operations (CONOPS) for the system, whatever they are talking
about, going through an Analysis of Alternatives themselves, a
smaller Analysis of Alternatives.
Is a materiel solution needed, or can we change training
and doctrine and get the job done, and that capability gap in
that way?
If a materiel solution is needed, then they neck it down.
And they have an Initial Capability Document (ICD) that is
actually approved by the Joint Requirements Oversight Council
(JROC), of the very senior group in the Joint Chiefs of Staff
(JCS).
So, coming into the MDD, the JCS has said, this is a
capability gap that we need to fill, and it needs a materiel
solution.
In the MDD itself, with that as a starting point, and the
JCS briefs that ICD what it is they want from a materiel
solution, then we talk through what the Analysis of
Alternatives needs to be. And that is where we kick off and
start the Analysis of Alternatives.
We have all had an opportunity to look at the guidance for
the Analysis of Alternatives. And it is going to start,
Chairman Andrews, with, can we modify the current system? That
is absolutely already there.
And then it will gradually go up in a ladder step, if you
will, all the way to pressing technology to a really--got to
choose my words carefully here, gentlemen--but you can have a
number of alternatives, one of which, the last of them could be
we have got to push technology to get what it is.
But then, in that Analysis of Alternatives, you have got to
look at measures of effectiveness. How is it supposed to
perform? What reliability are you expecting of it? What
suitability are you expecting of it? What survivability are you
expecting of it?
And then you need to look at the costs. And so, what we
will do in the MDD is, we will talk through that acquisition,
the Analysis of Alternatives plan. Is it rich enough? Is it
robust enough? Is it considering the alternatives that you are
talking about?
And I have done one of these. I am not just making this up
as I go along. We have been through one.
I structured very carefully to be sure that we had that
dialogue, because we have the senior stakeholders in the room,
and we want to get the benefit of their advice and counsel as
we are going forward. And then we will talk through it.
Of course, a very significant part of the Analysis of
Alternatives is, in fact, the cost analysis versus the various
alternatives that we are talking about. If you are going to
modify a system, it should be less expensive than pushing
technology. And that needs to be compared.
What are you going to get in terms of suitability and
effectiveness versus cost? That is part of the outcome. And do
you understand the environment that it is going to be operated
in? So, we talk through the Analysis of Alternatives plan.
And finally in that MDD, we give the program manager--and
there is a designated program manager for all of them--an
opportunity to tell us what he thinks the whole program will
look like. Now, that is early to need.
But there will be funding associated, because, as you all
know, we do a Program Objective Memorandum (POM) over a number
of years. And there will be a rudimentary schedule--not
prejudicing the AOA, because remember, the program has not
started yet at the MDD.
Following the MDD, we go to a milestone A, which is, as Mr.
Sullivan recognized, the beginning of the technology
development phase. And following that technology development
phase, then we will get into the product development phase.
That is where we snap the chalk line and put down the basis of
the program, of the cost estimate.
But we do want to have the program manager talk to us in
general terms about how they are looking at this program, how
long it is going to take to get that capability to the
warfighter.
So, I hope that in a brief----
Mr. Conaway. Well, I would be----
Mr. Ahern [continuing]. Brief, why it is the MDD is
important to us, and what we do in that MDD and how we go
forward with it.
Mr. Conaway. A classic example of not answering my
question.
I was more focused on the cost estimate and a rigorous
review of that, wherever it fell in the system.
Mr. Ahern. Okay.
Mr. Conaway. I am not sure it was before we decided to go
forward we decide how much it is going to cost, or after we are
into it and we decide it is going to cost.
Mr. Ahern. Can I answer that?
Mr. Conaway. But----
Mr. Ahern. I am sorry.
Mr. Conaway. Yes, but I want to make sure that whoever is
responsible for doing the cost estimate--I come from an oil and
gas background. And when you get a geologist who has put
together a new prospect, they get in love with it. And they
want it sold, they want it done. The problem is, they have got
to go sell it to a third party, who takes a different look at
it than the guy putting it together.
So, I do not want our folks so in love with their
prospect--and in this sense, it was an acquisition system--that
they lose sight of--lose objectivity.
Is there in that, somewhere we snap the chalk line, an
independent--not necessarily the GAO--but an independent review
of this whole system to that point, so we make sure we do not
have folks who have fallen in love with a system and are no
longer objective on the costs and all these other things you
mentioned?
Mr. Ahern. Yes, sir, there is, absolutely. There is an
independent cost analysis done on all the major systems by the
CAIG, the Cost Analysis Improvement Group, on every system.
So, when we come to snapping the chalk line at what we call
milestone B, the beginning of EMD, we will have two separate
estimates to look at, one from the service and one an
independent cost estimate----
Mr. Conaway. Okay. One quick follow up.
Mr. Ahern [continuing]. Done totally differently.
Mr. Conaway. The 5000.2 regulation from last August, are
you going back and applying those to all existing systems? Or
is that just for new systems going forward?
Mr. Ahern. It will apply to existing systems, depending
upon where they are, sir.
One of the programs that is already in production, it
probably will not see too many changes, based on the new
5000.2.
Mr. Conaway. Even if----
Mr. Ahern. But if a program has just started, we will
absolutely start it in the last six months or so.
Mr. Conaway. But if changes could rein in some cost
overruns, though, you would do that, wouldn't you?
Mr. Ahern. No, sir. We would not force anything in. We have
not fundamentally changed the sequence of events that a program
goes through. What we have tried to do is increase the
discipline in following that sequence of events.
As Mr. Sullivan remarked, there is room to tailor--or as he
said, ``wiggle''--inside the 5000.2. What we are trying to do
is reduce the wiggle and ensure that we have a disciplined
process that we follow for each and every of the acquisitions.
So, I do not think we would--we are conscious of what you
are saying, sir, and we would not drive cost into a program to
adhere to the 5000.
Mr. Conaway. You would drop cost out of it, though?
Mr. Ahern. I think we will. I am really keen on the idea of
doing that technology development phase after we do the
Analysis of Alternatives, to go to that phase where we will
look at competitive prototyping--and we have already done a
couple of programs along those lines--where we have two or
three competitors putting together either an entire prototype
or key elements of the system, and demonstrating it.
And that, as I said in my remarks this morning, that helps
us to understand, is the technology available. And frankly, in
putting together a prototype, it can give us a real leg up on
cost estimating.
So, I think that this new 5000.2, with its emphasis on that
phase, should, in fact, help us to drive costs out of programs.
It will tell us about technology maturity, and it will tell us
what is doable within a period of time.
Mr. Conaway. Thank you, Mr. Chairman.
Mr. Andrews. Thank you, Mr. Conaway.
Mr. Cooper.
Mr. Cooper. Thank you, Mr. Chairman.
Thanks to the witnesses.
It seems to me that, if there has ever been an alphabet
soup of bureaucratic quagmire, this is it, with 130-some
attempts to reform the system since World War II. And I am not
sure that anyone can even understand whether any of these
attempts or reforms worked or not. So, I guess our first job is
to not make the problem worse.
When the Secretary of Defense told us at breakfast last
week that there was something like 50,000 private sector
contractors whose only job is to oversee other private sector
contractors on things like contract performance and things like
that, I think it gives us and the public an idea of how
monstrously complex this whole process has become.
It seems to me that--and forgive me, because you gentlemen
have spent decades studying this, and we are largely new to the
complexities of this topic--that some of these problems are
self-inflicted wounds.
Mr. Assad talked to us recently and said basically that it
is the CAIG that comes up with much more realistic cost
estimates. But sometimes politicians and contractors refuse to
listen to those, and we prefer the lower numbers, however
unrealistic they are.
So, when it comes to setting and enforcing baselines, well,
we sometimes enjoy self-delusion, because, guess what, the
numbers usually, almost always turn out to be higher.
I wonder about things like the rapid acquisition process,
if it is essential. And maybe this is just for smaller systems.
But if we can somehow bypass our own bureaucracy when we need
to, why don't we do it more often?
I also wonder about if there is any good news in here? Is
there a pony here somewhere? Are there certain systems that are
so astonishingly reliable or productive or necessary for the
warfighter? You know, have any contractors ever been rewarded
for those?
In my area, we fly still a lot of C-130s, most of which
were built before I was born. And they are still going, and
they are still the warhorse. They are still reliable.
So, I would like to leaven the bad news with the good, if
there is any good news. And I still want to be reminded that we
spend more on our defense than almost every other nation on
Earth combined.
So, we are the policemen of the world. We are the
warfighters of the world. You know, the value for the taxpayer
is increasingly essential as our taxpayers are losing patience
with lots of different things that we are undertaking.
So, forgive me for the general sort of take on this. But
when gentlemen like you come to us and say there are strategic
and tactical failures in the procurement of our essential
weapons systems, that is from top to bottom. That is the
military. That is the Secretary of Defense (SECDEF). That is
us, the White House.
So, this may be a task far larger than a simple panel can
undertake, but we appreciate your guidance. And if, in the time
remaining, if either of you would care to reflect, I would
appreciate it.
Mr. Sullivan. Well, you know, I would like to go back to,
after hearing you speak there, the chairman asked the question
to me, what are the things that should change. And I do not
know that I answered it real well.
It is those prerequisites, though, that I talked about
earlier. And when you look at what you just described, we kind
of describe that as--I do not want to use the word
``failures''--but a lack of success, if you will, at a
strategic level, and then down into the execution phases.
And at a strategic level, if you want something good to
happen, probably the first thing, the first prerequisite is to
have less programs vying for this money that is available. When
you have the number of systems--and one of the reasons there
are 96 systems that are in the portfolio now, is because it is
a relatively service-centric kind of a system, so you have a
lot of parochialism.
And I think sometimes where you could have a joint
solution, or you could have a solution that does not have to
become an acquisition program, the services kind of compete
with each other to get programs started.
So, at a top level, Office of the Secretary of Defense
(OSD), the Secretary of Defense can do a better job of getting
a handle on that, trying to control the service-centric aspects
of this and try to reduce the number of programs that are
really competing in unhealthy ways for that limited dollar that
is out there.
Mr. Cooper. The average tenure of a SECDEF has been 16
months over the last 40 years.
Mr. Sullivan. And in fact, the average tenure--we looked it
up--of the Under Secretary of defense for acquisitions is
around 16 to 18 months. So, there is no real good continuity.
That is an excellent point.
I do not know what the answer to that is, unless there is
an undersecretary position that can somehow have a time term,
or something like that, that would be able to stay in place
longer.
But the turnover really affects a lot of this. You cannot
prioritize properly.
Once you cannot kind of keep control of the number of
programs that are beginning, when you get too many programs
into this portfolio, you get unrealistic baselines as a result.
And I would say, you know, that is the other thing, that you
need to have requirements that are analyzed a lot more.
I think the key thing that the new 5000 policy does is the
preliminary design review that they are calling for now. They
want to do that right around the milestone B, which is where
they snap the chalk line, as Mr. Ahern--that is when they start
spending the big money. The earlier they do that, the more
realistic estimates they will get, if they are doing that
properly. It is a lot of systems engineering that has to be
done early.
And that kind of sorts out, you know, risky technologies.
And unrealistic requirements are going to drive unrealistic
cost and schedule estimates.
If you do not understand the requirements that the user is
coming up with, and you do not have discipline there to say we
cannot do all that right now--you know, there are technologies
that you have to develop; we can get that in the next
generation, but not now--these programs will be hard to
control.
Mr. Cooper. Thank you, Mr. Chairman.
Mr. Andrews. Thank the gentleman from Tennessee.
The gentleman from Colorado, Mr. Coffman, is recognized for
5 minutes.
Mr. Coffman. Thank you, Mr. Chairman.
Mr. Sullivan, can you give me an example--I think you
mentioned immature technologies--can you give me an example of
that?
Mr. Sullivan. Well, for example, programs. You know, I hate
to single out programs, but I will take one that is almost
done, I guess. The F-22, for example, had technologies that
were part of the key performance parameters of the aircraft
that were very immature when they started milestone B and
started spending the big acquisition dollars.
And they fused avionics on that aircraft. They did not
understand those technologies well at all. In fact, some people
would say they were not invented when they opened up the
factories to build that, some of the propulsion technologies.
Some of the stealth technologies they were not real sure about
on that one.
I do not want to single out the F-22. You can pick almost
any major--even the C-17, which was a relative--you know, it
was a big cargo aircraft with relatively mundane requirements,
not for a cargo aircraft, but in general. And they had some
technology issues on that with some of the material technology
they were using that caused them a lot of problems. Very
immature technologies on that.
You can name almost any really made--the Future Combat
Systems, as, you know, we have looked at that and found that--I
do not know the exact numbers, but it is maybe 50 or 60
different key technologies that are supposed to drive that
system. Probably the majority of those are too immature to be
in product development.
And the way we look at that, that we have something called
technology readiness levels that, actually, we recommended that
the department begin using years ago. And the department has
started using those.
And, in fact, Director of Defense Research & Engineering
(DDR&E) does these technology readiness assessments that Mr.
Ahern referred to, so now they are doing that on every major
weapon system acquisition before it begins. They go in and look
at those and assign technology readiness levels to those
programs.
Some are still beginning with technologies that are too
immature. But I would say they are getting better at that.
Mr. Coffman. And you said words to the effect that there
are some programs that are not good candidates, that the system
does not necessarily ferret out programs that are the best
candidates.
Mr. Sullivan. You would like examples of those?
Mr. Coffman. Could you give one example of that?
Mr. Sullivan. Well, that was a kind of a general--more of a
general statement.
I think that it would be--if you want to consider a good
candidate to start product development, what we think the best
practice for that would be is a candidate where you understand
the requirements, you have looked at the requirements and
determined that there are things you can do and things you
cannot do, and you have gotten ridden of some of the
requirements that are not doable.
You have looked at your funding, and this program fits into
a funding profile that the department can count on, and you
have looked at technologies and things like that. And all of
these things fit.
I would say that hardly any of the major weapon system
acquisitions are good candidates, according to that criteria.
Lately there have been a couple, I think, that we are looking
at now that we think the department is doing good with.
I do not know. Small diameter bomb is an example of that,
where they have really looked at those requirements.
Mr. Coffman. Okay. Let me ask a question of both of you.
In 1992, I was with the Marine Corps, and came up here on
Capitol Hill and had a meeting with the officer in charge of
the Marine Corps liaison program. He was a brigadier general
whose name escapes me right now, but he said something that I
have never forgotten. And I would like if you would both
respond to this.
He said that, kind of almost--I will paraphrase it. We get
weapons systems that we neither want nor need, but are based on
a congressional--but are manufactured in a congressional
district whose congressman sees it as a jobs program for their
district.
Can you respond to that?
And that was his statement in 1992. First of all, did you
think that that statement was reflective of the environment in
1992? And is reflective of the environment today?
Mr. Ahern. No, sir.
Mr. Coffman. Okay.
Mr. Ahern. I mean, let me say--I mean, as you read my
biography, I was program manager and Program Executive Office
(PEO) at that period of time. Obviously, I am aware that there
are--congressmen have industry in their district.
But I have never felt--never, ever felt--that we were
pressured, encouraged or in any way directed to do anything
that entered into that. I honestly think that we made mistakes,
that each time we started down the road toward a product
development, we have done it as well as we can.
We appreciate your interest and support, but I have never
felt that we have been pressured into doing something for that
kind of a reason.
Mr. Coffman. Okay.
Mr. Andrews. If the gentleman would yield just a moment, I
would like to piggyback on this question.
Mr. Ahern, could you say the same thing about a situation
where there has been a cost overrun in an existing program and
there has been an effort to limit it or eliminate it, there has
not been congressional pressure to resist that elimination or
cutback?
Mr. Ahern. Well, as you know from my portfolio, I have been
through a number of cost growths, because it is a difficult
job. And again, I can say, yes, sir, never has happened. Never
has happened to me personally, and I have had some fairly
senior jobs in the Pentagon and in the Navy and in the Air
Force.
No, sir. It has absolutely never happened to me. And to my
knowledge, the work that we have done in the Pentagon, the
reviews, the Nunn-McCurdy reviews, I have not heard that at
all, sir. No, sir.
Mr. Andrews. Mr. Coffman, thank you. Mr. Ahern, thank you.
Mr. Coffman. Mr. Chairman, I am wondering if Mr. Sullivan
might be able to respond.
Mr. Sullivan. Yes, I think the first part of your question
I can deal with, and that is a warfighter saying that they did
not get what they need. The reasons are many-fold.
But the one that I look at most is, a lot of times a
warfighter does not get their urgent needs, because we are busy
working on very highly complex, single-step-to-big-bang
capability systems that are draining a lot of funds and time
and energy, like the F-22 or the Joint Strike Fighter or the
Future Combat System.
Or you can go across the board. A lot of these big, you
know, kind of unachievable requirements programs take so much
time and so much money, that I think a lot of times the
warfighters--there are bill-payers out there for these, right?
We all understand that.
So, there are smaller programs that have to pay the price
when an F-22 program begins with an unrealistic cost and
schedule estimate and every year needs to be plused up in order
to get the development through. And so, people pay for that.
And I think that is something that needs to be looked at,
as I think the warfighter suffers that way, because these big
programs are taking the dollars.
The other thing I would say is, if you look at the Mine
Resistant Ambush Protected (MRAP) vehicle acquisition, the MRAP
acquisition actually was pretty good, you know. Once they
focused on the fact that they needed to counter this threat,
and it was an urgent threat, they moved very quickly to get
MRAPs to the field.
Before that, I think that this acquisition process hindered
the ability to do that, because there were other acquisition
programs--you know, the Army, for example, had other programs
that it needed to fund, and things like that. I think it took a
long time for people to accept the fact that the MRAP had--we
had to put money into this, we had to go with requirements that
were doable right now, and we had to meet an urgent need.
Once those decisions were made, that went pretty well.
Mr. Andrews. If I may, just also, I think one of the
reasons it received such a high level of congressional scrutiny
on an almost daily basis, Chairman Hunter at the time, Mr.
Skelton watched the process like a hawk, as did several other
members. And I think there is some subjective value--some
suggestive value in that.
Mr. Ellsworth is recognized.
Mr. Ellsworth. Thank you, Mr. Chairman. And I think my
comments to be more general.
First, in fairness, I want to say that we brought up U.
Conn.'s basketball program. I thought I had better bring up Mr.
Sullivan's alma mater----
Mr. Sullivan. Maybe not right now. [Laughter.]
Mr. Ellsworth. They have been known to throw a few balls
through the hoop over the years, so hopefully they will get
back to that.
Mr. Sullivan. Thank you, sir.
Mr. Ellsworth. When I first got put on this panel and you
distributed some paperwork, one of the charts that jumped off
the screen was the actual chart that showed the acquisition
process. It reminded me of a Dr. Seuss configuration. And that
was funny in the Dr. Seuss books, but in military acquisitions,
I did not find much humor in that, and thought, you know, how
in the world can anybody navigate their way?
And as I still try to learn that system, there are just
things that jump off the page. You would almost have to think
it was planned confusion. And I hope that is not the case.
But it would seem to breed the shirking of responsibility,
how we go through, and how many points we could push off the
responsibility on these decisions.
And another thing comes to mind, and I think that Mr.
Coffman said it well. And I heard the president--I think Mr.
Cooper and I were at a meeting at the White House a few weeks
ago where the president said, ``I have to make decisions on
national defense really based on national defense, not on a
congressman's district and what they make there.'' And so, that
gives me great hope that those decisions will be made on that.
One of the things we also have to be--and maybe you could
talk to this--is fairness to the contractor. You know, when we
are building the hull of a ship and signing that contract, and
then it is designed as we go, as we build up from the basic
platform up, not unlike the Congressional Visitors Center,
where it started off at $300 million and kept changing and
changing and changing, and then went to $600 million. I think
it is the same with the helicopter, if we keep adding things as
we go.
We have also got to be able to give these--nobody is going
to go out and just build a ship and hope that the United States
government buys that ship, or get ready and build a factory
that is ready to go, whether it is two submarines a year, one
submarine a year and say, hey, I hope they up that someday.
We have to give the contractor some kind of vision of what
we are looking at. I know that is tough in two-year cycles and
six-year cycles and every four years. But I think we have to
give them something to look at when they are putting employees
together and buying equipment, and how we address that along
with rapid acquisition.
Where is the middle ground there between the normal
acquisition process and rapid? Is there something in between
there that works better, simplifying it, making the steps down?
So I guess, if I had direct questions, it would be, you
know, maybe we can look at the percentage of programs pulled
just because of cost. The warfighter needed it, but it was just
too expensive.
Or the percent pulled because we thought we needed it. It
sounded good. It would be nice to have. But then we realize
halfway in, maybe we do not. It was a great wish, but we did
not really want it.
And I am just talking in general. I know that this is our
first meeting, Mr. Chairman, and I appreciate it, as we try to
get our arms around it. But maybe you could address the
fairness to the contractor and how we look to improve that
system also.
Mr. Sullivan. The way I look at that is--that is a really
good question, because obviously, we are in a--the defense
industry is an industry that does not necessarily build things
in volume, because--you were kind of iterating to that.
There are not real commercial markets for these things.
These are going to be--you know, you build a couple of
submarines and you are done. You build a number of fighter
aircraft, and there is no more market for that. There is no re-
sale. There is nothing like that.
So, you are starting out with an industry that--this is
very capital-intensive, very labor-intensive and low volume.
And in addition to that, it is cutting edge.
And I do not think any of us will ever get to a time where
it is not risky to build defense acquisitions, these big weapon
systems, because we want them. We want the best in the world.
The risk could be a lot less than that.
So, what the department usually does with these big
contractors, the way they deal with that, I think, is with
these cost-plus contracts that are a necessary part of doing
business, because of what you referred to.
No company is going to, on their own nickel, begin to
invest in the facility and the tooling that they need to take
care of the government, because that is the only game in town.
So, if they do not get it, they have wasted their money.
So, the government pays for all of that. So, that is kind
of the fairness to the contractor, I guess, is we kind of take
on their risk.
On the other hand, we look at the funding on an annual
basis, and never really give contractors on some programs the
stability that they need, the security they need in receiving
those funds on a year-to-year basis to be able to do their job
better.
And when you combine that with the requirements that we
have already discussed, when you have a contractor working on a
weapon system that has capabilities that they really are not
able to build right away, and they have this cost-plus
contract, and they have the President and the Congress every
year looking at their budget, it does create a really unstable
environment for them to operate. So, those are some things.
Requirements are unachievable. Cost-plus contracting comes
up a lot. I think a lot of people talk about getting rid of it.
I do not think necessarily reasonable people think that is a
good idea. But if you can get to a position where you build--
where you have requirements to build things that are doable,
the cost-plus contracts make a lot more sense.
Mr. Ellsworth. Thank you both very much.
Thank you, Mr. Chairman.
Mr. Andrews. Thank you, Mr. Ellsworth.
Mr. Hunter is recognized for 5 minutes.
Mr. Hunter. Thank you, Mr. Chairman.
And in the interest of time here, we will start off with my
first question. In 2003 to 2008, we have had kind of a
permissive environment. I think you would agree to that.
Who was responsible, if you had to lay it out in terms of
who was most responsible to least responsible for the
permissive environment, that maybe the services and Congress
and contractors took advantage of and are now getting reined
in? Who is responsible? When it comes to DOD, Congress,
contractors, who would you lay the blame on the most from 2003
to now?
Mr. Ahern. I started here in 2006. And, of course, I was an
instructor before that.
I would not go to blame. I would say that the discipline in
looking at the system, at the elements of a development going
forward, were not as rigorous then as they are now. But I am
not sure that I could say that the blame is there. The system
was always there. The system has always been there.
And there have been some good programs started, and they
are continuing to execute through that period of time. I think
of the P-8, the Navy's replacement for the P-3, which I think
was started in that period of time.
So, I am unable to ascribe blame. I think a number of the
programs that are in the portfolio Mr. Sullivan mentions are,
in fact, executing over the last 5 years quite well. I believe
that that is in his report.
So, I am not sure that there was a permissive environment
in that period of time. I think that what we have done
recently, or in the new 5000, is to add--and in other things
that we have done--to add additional expectations.
But I would not agree that there was a permissive
environment during that period of time. And I think, as I said,
there are a number of programs that have not shown cost growth,
that were started in that period of time.
Mr. Hunter. Almost every service acquisition officer who I
have talked to in the last--I have only been here for about 2.5
months. But everyone that I have talked to said there was
indeed a permissive environment, and that contractors took
advantage of it--not necessarily at the big level programs, but
down at the lower levels, middle levels, and that that is being
reined in now.
And industry is suffering to some extent. The services are
having to be more responsible, and DOD is having to be more
responsible, too.
Mr. Sullivan. One of the comments I would make is that, you
know, one of the things that started around, in the 2000
timeframe is when we started having these acquisition programs
like Missile Defense Agency (MDA), Future Combat Systems, Joint
Strike Fighter. Some of these very, very complex what they call
systems of systems, as opposed to just building a weapon
system. You had the Future Combat System, which is a--
ostensibly, there are 19 programs kind of working within that
that somebody is trying to coordinate.
Joint Strike Fighter is three separate fighter aircraft
variants that Lockheed Martin is trying to do at one time.
There are a lot of programs out there like that. So, I do not
think the government particularly did that very well.
The MDA is an acquisition program. It is an agency. And it
has--I do not know the number now, but it is in the teens of
highly complex programs that are supposed to work together.
So, I think one of the things that happened is that it got
much more complex, so complex that we cannot understand it too
well.
And I think the permissiveness, a lot of that might have
come from what happened. You know, there was an attempt at
acquisition reform in the mid 1990s.
Mr. Hunter. Well, as complexity goes up, costs go up and
everything else goes up.
Mr. Sullivan. I think absolutely, yes.
Mr. Hunter. Let me jump in here, because we have to run.
How do you--and this kind of goes to those mid-level
warfighter programs that guys really need, programs like Joint
Improvised Explosive Defeat Organization (JIEDDO), Task Force
(Odin) and things that we funded, because of immediate current
threats that the warfighter has begged for, but that Congress
almost has to force on the different services. And these
programs start off agile and responsive, and they end up slow,
unresponsive, bureaucratic.
And these are all programs where the ongoing operation is
just as important as the initial acquisition of them.
And it seems like everything works well when it first
starts off. It is lean and mean, and then it kind of gets out
of control, once they get their billions and they hire 300
bureaucrats. Everything slows down and they become
unresponsive, and they do not necessarily do what they were
intended to do in the first place.
But that is more of an ongoing thing, but it still has to
do with the initial acquisition and programs not doing what
they are supposed to do after the initial acquisition.
How do you fix that?
Mr. Ahern. Well, there are a couple of things that occur to
me immediately, congressman. One is the MDA that Mr. Sullivan
mentioned a couple of minutes ago. That was initiated, I think,
in 2002 timeframe, and is one that had a very near-term
expectation that they would develop a capability by--develop
and field a capability--by 2004.
And then, as time went--and then continue on. And I think
the MDA has a number of elements in development moving toward a
block capability that will enable them to expand upon that
initial capability.
And what we have done with the MDA is to ensure that they
continue to have that Research, Development, Test & Evaluation
(RDT&E), or that technology development focus, and move toward
transitioning the capabilities that they are developing to the
services to operate as rapidly as we can. It requires focus in
that area.
Another one that comes to my mind that we are working on
now is the Intelligence Surveillance and Reconnaissance (ISR)
task force that Secretary Gates started. And that is one that I
support from an acquisitions standpoint. And it is absolutely
to push ISR resources into the operating areas as quickly as we
can.
I expect it to operate for a period of time. I expect to
continue to push on it. And the same kind of thing is happening
with the MRAP.
And it really is, as--it is up to the people, the
discipline that I mentioned earlier on, to have, continue to
have that interest in getting the job done to support the
warfighter. And I cannot say it any better than that.
We have to do our jobs responsibly to ensure that we do
maintain the focus on the urgent operational needs, the support
for the warfighters, as I think we are doing in the ISR task
force, as we did with the MRAP and continue to support the
MRAP, and as we are doing with the MDA. Those are three
examples that occur to me.
Mr. Andrews. Thank you, Mr. Hunter.
The chair recognizes Mr. Sestak for 5 minutes.
Mr. Sestak. Thanks, Mr. Chairman.
I honestly believe that the requirement side, particularly
since General Cartwright used to have J8, really has come a
long way from the Cost Benefit Analysis (CBAs), and the
modeling he has done down there. So, I want to focus more on
the acquisition side.
But could I ask you, sir, if you could get back to us? My
limited understanding is that under Instruction 5000, all the
modeling that is used on the requirement side has to go through
a validation, verification and analysis by an independent
somebody to make sure the models used are well. My
understanding is, probably only 5 percent have.
Could you get back to us on that----
[The information referred to can be found in the Appendix
on page 80.]
Mr. Ahern. Yes, sir.
Mr. Sestak [continuing]. Because we are using models in the
requirement side that have never been validated. How good are
they?
So, I think that is what I would really like to know. My
understanding is, even though there is a requirement to in
5000, hardly any of them have. And I think that is something
that does not bode for credibility on the modeling being used.
The other side of the acquisition side, I was quite taken
by your comments about the Earned Value Management (EVM)
program, where we are supposed to forecast final cost. Also,
sir, with you, I was taken with how you want to try to
establish prerequisite indicators to try to get realistic
costs, to kind of break this tyranny of optimism we have.
The other day, when the Littoral Combat Ship (LCS) came
forward, initially it was supposed to be $250 million. Now it
is $460 million. The internal DOD figures only have a 50
percent confidence factor that that is going to be the actual
cost.
I guess my point is, we have Nunn-McCurdy. Thirty programs
the last three years have come over to us, breaking, breeching
Nunn-McCurdy. Nice monitoring system, but no enforcement, no
teeth on it.
You mentioned the CAIG, sir. CAIG estimates are not
mandatory, yet they turn out to be, by and large, much more
right than the services. Should we, A, make the CAIG estimates
mandatory?
Number two, shouldn't the Congress have exposure to those
confidence factors that you have internally to the services,
for example? The aircraft carrier that is going to be built
here in a few years is going to cost, according to some
estimates, $13.5 billion. But the internal confidence factor of
costing on that is 37 percent.
We are building a lot of Virginia class submarines. But the
internal confidence factors on the costing for the Virginia
class submarine--two of them being built here in the next
years--is less than 50 percent.
Should Congress here have access to that data, so before we
commit to something like an LCS, we know at that time maybe
that 20 percent confidence factor before we get too far down
the road?
Sir.
Mr. Ahern. Sir, taking it in terms of the CAIG being
mandatory, I think we need to maintain that balance. The
experience that I have is, we tend to use the CAIG estimate for
the estimate for the program. But we need to have that dialogue
between the CAIG and the service estimate, the program office
estimate, because they are different. They are done
differently.
And I think that that dialogue is necessary. And if we made
the CAIG mandatory, I think that the service cost estimating,
which enables that dialogue, would----
Mr. Sestak. Do you think we should have access to both
cost----
Mr. Ahern. Well, I think you should have access to the cost
estimates.
Mr. Sestak. I mean, if they are not an----
Mr. Ahern. I am not sure----
Mr. Sestak [continuing]. Independent one, like you said,
shouldn't Congress know before we dole the national treasure
out, what CAIG's estimate is versus the service's, or at least
the confidence factor that is coming forward?
Mr. Ahern. I think we do send over the--I mean, the results
of that estimating is done in the Acquisition Program Baseline
(APB) that is reported in the Selected Acquisition Report (SAR)
reports for the programs.
So, I think that the results of that work is absolutely
reported to you in the SARs that come over on all the major
programs when they initiate them at a program milestone. I
really do think it is there.
Mr. Sestak. All right. I probably missed it.
May I ask another question? I think my time--oh, I have
time for one more.
My question is on the more jointness. I was always quite
taken with what Representative Skelton and many did with the
JCS, you know, with having--back in the Goldwater-Nichols day,
when the chairman walked in after that was passed, everybody
else stood up. Before that, the JCS would not stand up when he
walked in the room, because he was one of equals.
To some degree, do you think, as we have a wonderful
system, including having OSD involved in the JCIDS process now,
which it was not before--I mean, a few years ago they were
brought in--do you think that we need to structure the JROCs,
but not so much as what we did to the JCS, so it is not
everybody is equal there? You may get the least common
denominator. It is my program versus your program.
But do we need a Goldwater II, in a sense, to say there
should be one final requirements guy before it goes any
further, rather than least common denominator. Any comments?
Mr. Sullivan. Well, my comment on that is that, you know,
Goldwater-Nichols was to make jointness. And it did it on the
operations side, I think. You know, if we see the way----
Mr. Sestak. But not the procurement side.
Mr. Sullivan. Not the procurement side. So, there, I think
you raise a really excellent point.
These programs have to be acquired jointly, and they are
not right now. There is still too much service-centricity, even
in the JCIDS process, which, when you read the JCIDS policy, it
is really pretty good policy.
Mr. Sestak. It is.
Mr. Sullivan. It establishes functional capability boards
and a lot of joint matrix processes that are supposed to look
at requirements and weigh in in a purple-suited way, if you
will.
Those things are not in effect right now. They have not
staffed them properly.
I think you still--I do not think that the policy has been
implemented well.
Mr. Sestak. I am out of time. But my only thing is, when it
finally gets to the JROC, it is wonderful work up till there.
But that final decision, I just do not know if you need to
change it like the JCS----
Mr. Sullivan. Yes, if I could just comment briefly on one
other thing you were talking about, the CAIG estimates.
We have done a lot of work in that area. And we found that
the CAIG usually has a little bit better accurate estimate, but
still, far from what the outcomes usually tend to be. And the
department does not always accept those estimates.
And I think that an independent CAIG would be something
that might be helpful to the Congress, something similar to how
they do Director of Operational Test & Evaluation (DOT&E). They
established an independent director. I think that is something
that might be helpful.
Mr. Ahern. I would like to go back and comment on Mr.
Sullivan's last comment. I think that the more we emphasize
that need to do the work prior to beginning a program, the
product development, we will have a far better informed,
independent as a well as a service estimate.
So, I do not think the answer, sir, is structurally to set
up an independent CAIG or to do something along those lines. I
think it is to ensure that we have more information on the
products before we get into the product development.
The competitive prototyping, the insistence upon technology
maturity are the kinds of information that we need to have.
Otherwise we are using old information, parametrics. Well, we
did it this way a couple of years ago, so we will use that kind
of way. Or circuit boards cost $1.98 in 2005, so assume they
cost $1, you know, $2.05. We need to have better information
than that.
That is why I think, yes, the CAIG estimate, the service
estimates are critically important. But to improve them, what
we need to do is improve the information that goes into them.
Mr. Sullivan. But in order to improve that information, you
might need an independent assessor overseeing all that----
Mr. Sestak. You could be joined and do both.
Mr. Sullivan. This could be a chicken and egg thing. I do
not know.
Mr. Andrews. I would like to thank Mr. Sestak, particularly
for the observation about the dichotomy between the jointness
in operations----
Mr. Sullivan. Yes.
Mr. Andrews [continuing]. But the lack of jointness in
procurement. I think it is a very critical point----
Mr. Sullivan. Yes, very----
Mr. Andrews [continuing]. That goes to a lot of what Mr.
Ahern and Mr. Sullivan said. Thank you.
I am going to ask if Mr. Conaway has any concluding
remarks.
Let me also mention that, if any member of the panel would
like further analysis or questions, please submit them in
writing. I am sure the witnesses would comply with a written
answer.
Mr. Conaway.
Mr. Conaway. Well, witnesses, thank you very much for
coming this morning at an unusual hour for a hearing like this.
I appreciate that.
I would like to follow up with you with respect to these
CAIG estimates and the ones that are actually used, and what
requirement there is for reconciling the differences between
the two on the front end.
Mr. Conaway. And then, Mr. Sullivan, you may have some
historical data about tracking reasonable estimates, CAIG
estimates and reality.
Mr. Sullivan. Yes, I do.
[The information referred to can be found in the Appendix
beginning on page 78.]
Mr. Conaway. It might be helpful for us to look at it.
But in closing, it is actually the panel's responsibility
to find legislative issues that need to be addressed on a go-
forward basis. So, if there is legislation that you see is
needed to help what we are trying to do and trying to
accomplish, which I think all of us have the exact same goal,
please point that out to us, as well, because that is really
the goal of what we are trying to get done.
Chairman, thank you.
Mr. Andrews. Thank you.
I would like to thank the witnesses for their outstanding
preparation. We are going to call on you as the process goes
forward, I am sure.
There are two items I would ask you to supplement the
record with.
Mr. Ahern, I am interested in your views on how we might
eliminate the wiggle room that has been identified in the 5000
guidance. I know you are working very avidly on that. We would
be interested in your direction to us on how that process can
be sharpened and improved. I thought your comments this morning
were very edifying.
[The information referred to can be found in the Appendix
on page 77.]
Mr. Sullivan, I would like your views on the question of
the SARs. The selective acquisition reports that come over are
an excellent tool for the Congress to evaluate problems. As I
understand it--and correct me if I am wrong--I think all the
SARs come post-milestone B. Am I right about that?
Mr. Sullivan. Yes.
Mr. Andrews. Yes. I am interested in whether there is a
similar analytical tool that could be created by legislation
pre-milestone B, so that we could get an early warning signal
that there is something wrong in the design phase.
Now, I understand it is a whole different set of questions,
but I think that a SAR-type tool pre-milestone B would be very
useful for us. I would be interested in your thoughts.
And you, as well, Mr. Ahern.
[The information referred to can be found in the Appendix
beginning on page 77.]
Mr. Ahern. All right.
Mr. Andrews. The committee is going to--the panel is going
to proceed after we return from the recess with another
hearing. We will be consulting with the minority as to what the
topic ought to be with respect to that.
I am confident that we are going to broaden the issue of
defining what I began this morning talking about, the delta,
the gap between what we are paying for and what we are getting,
beyond the issue of the major weapons systems, as well. I think
we have had an excellent discussion of that subject, but we
want to go beyond that to the budget, the huge--the majority of
the procurement budget that is not major weapons systems.
You have given us some very sobering news this morning,
nearly $300 billion in overruns. And I think the good news we
hear is that there are tools in place for us to understand the
causes of these problems, which then gives us the ability to
find solutions.
What is disturbing, of course, is that the trending is in
the wrong direction. If you look at the difference between 2003
and 2008, I think your data are accurate. The problem is
getting worse and not better. And I think that goes to Mr.
Conaway's point, and Mr. Cooper's point earlier. You know, the
ream upon ream of assessment of this has gotten us nowhere--
worse than nowhere.
So, we are interested in trying to put teeth into the
decision-making process in a way that makes this work.
Again, the record will be open for any member to submit
more questions to the panel.
We thank both panelists for excellent presentations this
morning, and we stand adjourned.
[Whereupon, at 8:54 a.m., the panel was adjourned.]
?
=======================================================================
A P P E N D I X
April 1, 2009
=======================================================================
?
=======================================================================
PREPARED STATEMENTS SUBMITTED FOR THE RECORD
April 1, 2009
=======================================================================
[GRAPHIC] [TIFF OMITTED] T1761.001
[GRAPHIC] [TIFF OMITTED] T1761.002
[GRAPHIC] [TIFF OMITTED] T1761.003
[GRAPHIC] [TIFF OMITTED] T1761.004
[GRAPHIC] [TIFF OMITTED] T1761.005
[GRAPHIC] [TIFF OMITTED] T1761.006
[GRAPHIC] [TIFF OMITTED] T1761.007
[GRAPHIC] [TIFF OMITTED] T1761.008
[GRAPHIC] [TIFF OMITTED] T1761.009
[GRAPHIC] [TIFF OMITTED] T1761.010
[GRAPHIC] [TIFF OMITTED] T1761.011
[GRAPHIC] [TIFF OMITTED] T1761.012
[GRAPHIC] [TIFF OMITTED] T1761.013
[GRAPHIC] [TIFF OMITTED] T1761.014
[GRAPHIC] [TIFF OMITTED] T1761.015
[GRAPHIC] [TIFF OMITTED] T1761.016
[GRAPHIC] [TIFF OMITTED] T1761.017
[GRAPHIC] [TIFF OMITTED] T1761.018
[GRAPHIC] [TIFF OMITTED] T1761.019
[GRAPHIC] [TIFF OMITTED] T1761.020
[GRAPHIC] [TIFF OMITTED] T1761.021
[GRAPHIC] [TIFF OMITTED] T1761.022
[GRAPHIC] [TIFF OMITTED] T1761.023
[GRAPHIC] [TIFF OMITTED] T1761.024
[GRAPHIC] [TIFF OMITTED] T1761.025
[GRAPHIC] [TIFF OMITTED] T1761.026
[GRAPHIC] [TIFF OMITTED] T1761.027
[GRAPHIC] [TIFF OMITTED] T1761.028
[GRAPHIC] [TIFF OMITTED] T1761.029
[GRAPHIC] [TIFF OMITTED] T1761.030
[GRAPHIC] [TIFF OMITTED] T1761.031
[GRAPHIC] [TIFF OMITTED] T1761.032
[GRAPHIC] [TIFF OMITTED] T1761.033
[GRAPHIC] [TIFF OMITTED] T1761.034
[GRAPHIC] [TIFF OMITTED] T1761.035
[GRAPHIC] [TIFF OMITTED] T1761.036
[GRAPHIC] [TIFF OMITTED] T1761.037
[GRAPHIC] [TIFF OMITTED] T1761.038
[GRAPHIC] [TIFF OMITTED] T1761.039
[GRAPHIC] [TIFF OMITTED] T1761.040
[GRAPHIC] [TIFF OMITTED] T1761.041
?
=======================================================================
WITNESS RESPONSES TO QUESTIONS ASKED DURING
THE HEARING
April 1, 2009
=======================================================================
RESPONSES TO QUESTIONS SUBMITTED BY MR. ANDREWS
Mr. Ahern. The version of DoD Instruction 5000.2 just issued in
December 2008 eliminates much of what others refer to as ``wiggle
room'' compared to the 2003 version. I view the Department's task now
as one of the Components properly executing their acquisition programs
in accordance with DoDI 5000.2 and the OSD staff ensuring we undertake
disciplined, thorough program and milestone reviews before allowing
programs to proceed to the next acquisition phase. We will monitor how
the new guidance is being applied and, if necessary, will issue policy
changes to sharpen and improve the process. If any of our changes
require new legislation, the Department will submit a legislative
provision for your consideration. [See page 28.]
Mr. Ahern. What you suggest is a SAR-like submission during the
Technology Development phase of the acquisition process. Technology
Development is a continuous technology discovery and development
process reflecting close collaboration between the S&T community, the
user, and the system developer. It includes significant competitive
prototyping that will inform us on the realism of requirements and the
maturity of technology. It will also significantly improve our cost
estimates for the Engineering and Manufacturing Development and the
Production and Deployment phases. However, at this point in the
acquisition process there is no clearly defined program to report on.
As such, an annual SAR-like submission that purports to provide SAR-
quality information for the technology development effort's life cycle
would have limited credibility or utility. There is an existing
certification requirement (10 U.S.C. 2366a) that does establish
expectations for system cost during technology development. Under that
provision, if the projected cost of the system, at any time prior to
Milestone B approval, exceeds the cost estimate for the system
submitted at the time of the certification by at least 25 percent, the
Milestone Decision Authority, in consultation with the Joint
Requirements Oversight Council, shall determine whether the level of
resources required to develop and procure the system remains consistent
with the priority level assigned by the Joint Requirements Oversight
Council. The Milestone Decision Authority may then withdraw the
certification concerned or rescind Milestone A approval if the
Milestone Decision Authority determines that such action is in the
interest of national defense. This is consistent with the iterative
nature of the Technology Development phase in assessing the viability
of technologies while simultaneously refining user requirements
allowing the Milestone Decision Authority to make an informed judgment
whether the priority warrants committing to a higher-cost program. Once
the program has been initiated and receives Milestone B approval, the
Department establishes an Acquisition Program Baseline, holds the
program manager accountable for execution to it, and provides Congress
the SAR. [See page 28.]
Mr. Sullivan. A pre-Milestone B SAR-like report could be a valuable
tool for assessing whether a program is on track to have a solid
understanding of requirements, technology, and cost before formally
becoming an acquisition program. It could serve as the basis for
illuminating early trades in all areas (cost, schedule, requirements,
and technology) and as an early warning mechanism to identify programs
that are proceeding without the requisite knowledge in those areas.
However, given that the pre-Milestone B technology development phase is
a time when cost, schedule, and performance trades should be
encouraged, we would not recommend using this reporting tool as a
baseline control mechanism to apply Nunn-McCurdy-like standards to
technology development costs. Annual reporting could begin at Milestone
A. Much like the current SAR, it could provide basic information about
the mission need the program fulfills the acquisition and technology
development strategies, the program's activities to date, and contract
performance. It could also provide information on the programs'
schedule, cost, performance, and knowledge that is tailored to the
early stages of the acquisition process. These data could include the
following:
Capability Need: A description of the capability need that
justifies the program, including the following:
capability gap that needs to be filled
priority level assigned by JROC to this capability need
timeframe in which the overall capability is required
type of materiel solution preferred--information system
solution, evolutionary development of an existing capability, or a
transformational approach
Analysis of Alternatives (AOA): A description of the assessment and
results, including the following:
the scope of alternatives considered in the AOA
the recommended solution derived from the AOA
the technical, operational, and programmatic risks
identified with the recommended solution
Schedule: A baseline estimate set at Milestone A and current
estimates for the completion of the following:
Systems engineering reviews: System Functional Review,
System Requirements Review, Software Specification Review, Preliminary
Design Review
Technology development: Technology Readiness Assessment,
Prototype Demonstration (start and completion)
Requirements: Capability Development Document
Development cycle: Estimated cycle time in months
(Milestone A to B), Estimated cycle time in months (Milestone B to C)
by increment of capability (if applicable)
Cost: A baseline estimate set at Milestone A and current estimates
in base year and then year dollars for the following:
Cost estimate for Milestone B through completion reported
as a range of likely costs
Performance: Prioritized list of Key Performance Parameters that
includes:
Proposed performance baseline at Milestone A
Current estimate of performance
Level of performance that will be demonstrated in the
Technology Development Phase
Level of performance that has been demonstrated in the
Technology Development Phase
Critical technologies that are enablers for each Key
Performance Parameter
Description of requirements that were added or removed
during the Technology Demonstration Phase
Technology knowledge: List of the program's critical technologies
that includes:
Milestone A, current, and projected Milestone B
technology readiness levels
Most current test environment (lab, relevant,
operational)
Most current physical status (breadboard, functional
prototype, full-up prototype)
Description of trades available if technology does not
mature as planned (use an existing technology, reduce or defer
requirements, etc)
Schedule for maturing technologies to TRL level 7 (i.e.
demonstrated in a realistic environment)
Design knowledge: Current estimates of the following:
Total and projected number of drawings released by the
Preliminary Design Review
Estimated size of the software development effort (in
lines of codes)
[See page 28.]
______
RESPONSE TO QUESTION SUBMITTED BY MR. CONAWAY
Mr. Sullivan. DOD policy requires the CAIG to prepare an
independent life cycle cost estimate for a major defense acquisition
program Milestone B decision. The policy states that the Milestone
Decision Authority shall consider the CAIG estimate before approving
the program to start system development. It does not require a
reconciliation of the CAIG estimate with other service or program
office estimates.
In a July 2008 report, GAO found that program cost estimates are
often significantly understated--a finding consistent with cost growth
patterns reported by RAND, the Institute for Defense Analysis (IDA),
and other organizations that conduct defense analyses.\1\ In that
report, GAO analyzed the cost of 20 major defense acquisition programs
through December 2007. While the CAIG estimates generally
underestimated costs by a smaller amount than program office and
service estimates, the CAIG estimates could underestimate a program's
costs by billions of dollars (see table 1). For example, the initial
service estimate for the development of the Marines' Expeditionary
Fighting Vehicle was about $1.1 billion. The CAIG estimated the
development cost of the program to be $1.4 billion, but the expected
development costs for the program had grown to close to $3.6 billion.
In the case of the Future Combat System (FCS), the Army's initial
estimate for the development cost was about $20 billion, while the
CAIG's estimate was $28 billion. DOD began the program using the Army's
estimate of $20 billion, but development costs for FCS had grown to an
estimated $28 billion. Many programs are also approved to start
development based on the service or program office cost estimate rather
than the CAIG estimate. Less than a quarter of the 48 programs in GAO's
2009 assessment of weapon system programs that provided data used the
estimate made by the CAIG as a basis for the program's baseline, while
almost 70 percent of the programs used the program office or service
cost estimate.\2\ [See page 27.]
---------------------------------------------------------------------------
\1\ GAO, Defense Acquisitions: A Knowledge-Based Funding Approach
Could Improve Major Weapon System Program Outcomes, GAO-08-619
(Washington. D.C.: July 2, 2008); Assessment Panel of the Defense
Acquisition Performance Assessment Project for the Deputy Secretary of
Defense, Defense Acquisition Performance Assessment Report (Washington,
D.C.: Jan. 2006): Defense Science Board, Defense Science Board Summer
Study on Transformation: A Progress Assessment (Washington, D.C.: Feb.
2006); RAND, Historical Cost Growth of Completed Weapon System Programs
(Santa Monica: 2006); and Institute for Defense Analysis, Costs Growth
in Major Weapon Procurement Programs, Presentation to the 38th Annual
DoD Cost Analysis Symposium (Williamsburg: Feb. 2005).
\2\ GAO. Defense Acquisitions: Assessments of Selected Weapon
Programs. GAO-09-326SP (Washington, D.C.: March 30, 2009).
[GRAPHIC] [TIFF OMITTED] T1761.043
RESPONSE TO QUESTION SUBMITTED BY MR. SESTAK
Mr. Ahern. The Department addresses Modeling and Simulation (M&S)
Verification, Validation, and Accreditation (VV&A) both in policy and
in guidance. On the policy side, DoD Instruction 5000.2, ``Operation of
the Defense Acquisition System,'' addresses M&S as part of an
integrated test and evaluation (T&E) continuum that includes
developmental, operational, and live fire T&E; family-of-systems
interoperability testing; and information assurance testing. The Test
and Evaluation Strategy requires empirical data to validate models and
simulations and expects reconciliation of pre-test predictions with
post-test results. The policy also provides for the use of accredited
models in support of developmental T&E, initial operational T&E, and
live fire T&E. DoD Instruction 5000.59, ``DoD Modeling and Simulation
(M&S) Management,'' establishes the M&S Steering Committee that
oversees the development of VV&A policies, plans, and procedures. DoD
Instruction 5000.61, DoD ``Modeling & Simulation Verification,
Validation and Accreditation,'' establishes common-sense guidelines and
requires that models and simulations used to support major DoD
decision-making organizations and processes (e.g., the Defense
Acquisition Board; Joint Requirements Oversight Council; and Planning,
Programming, Budgeting, and Execution System) ``shall be accredited for
that specific purpose by the M&S application sponsor''. DoD Instruction
5000.61 also requires that VV&A be documented. In terms of guidance,
the Department has taken a number of steps. The Defense Acquisition
Program Support Methodology used in program support reviews includes
strong criteria for evaluating a program's VV&A efforts. We have an on-
line ``VV&A Recommended Practices Guide.'' A new military standard,
MIL-STD 3022, ``Documentation of VV&A for M&S'' was approved last year
and is already in use for acquisition purposes across the Department. A
DoD VV&A Documentation Tool automates production of the MIL-STD 3022
VV&A document set and became operational this year. We are also
developing risk-based VV&A guidelines and pursuing routine examination
of VV&A when M&S informs major acquisition decisions. The Department
does not keep central records of VV&A and no studies have been
performed to assess VV&A documentation. So, without a data call to the
DoD Components, it is not practical to provide a quantitative
assessment of overall DoD VV&A performance. The Department recognizes
that VV&A is important so that we have confidence in our models and
simulations. While VV&A is covered both in policy and guidance, we also
know that we need to continue working with the Components to ensure
models used in our decision-making processes are properly accredited.
[See page 24.]
NEWSLETTER
|
Join the GlobalSecurity.org mailing list
|
|