Computers and Nuclear Weapons Design

Year   System        Elapsed Time   Personnel Time   MIPS    Storage   Speed
1945   IBM 601       3 months       9 months         -       -         -
1947   IBM 601       2 months       5 months         -       -         -
1950   IBM 602       1.5 months     2 months         -       -         -
1952   MANIAC        2 days         2-3 days         0.011   80 K      11 kHz
1954   IBM 701       1-2 days       2-3 days         -       -         -
1974   #             30 min         15 min           -       -         -
2010   Inspiron 15   -              -                -       -         2.3 GHz
# As a matter of interest, this entry has been added to the table while preparing the unclassified version of this account (1974) to indicate the further progress on this point since 1954. The major part of the elapsed time indicated here is required to prepare and check the input numbers, gain access to the machine, and wait while the printer lists the results.
SOURCE: LA-5647-MS, "A Short Account of Los Alamos Theoretical Work on Thermonuclear Weapons, 1946-1950," prepared by J. Carson Mark

The design adopted for the first hydrogen bomb did not come easily or quickly. Unlike with fission weapons, scientists had no clear idea of the range of physical constraints governing thermonuclear weapon design, so extensive mathematical modeling and simulation were required, and the need for ever more powerful computers was compelling. During World War II, all computing at Los Alamos was done with desktop calculators and a variety of IBM business machines, which were not capable of handling the complex modeling required to develop the hydrogen bomb.
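
The kind of simulation involved was pioneered at Los Alamos itself: the Monte Carlo method of Ulam and von Neumann, which follows many random particle histories to estimate quantities that resist closed-form analysis. As a rough illustration only, a minimal Python sketch of a Monte Carlo neutron random walk might look like the following; every parameter and the model itself are invented for illustration, and real weapons calculations were vastly more elaborate:

    import random

    # Illustrative toy only: a 1-D Monte Carlo "neutron" random walk in the
    # spirit of the methods Ulam and von Neumann developed at Los Alamos.
    # Every parameter here is invented for illustration.

    def escape_probability(thickness=3.0, mean_free_path=1.0, trials=100_000):
        """Estimate the chance a neutron born at the center of a slab
        escapes before being absorbed (hypothetical model and numbers)."""
        absorb_prob = 0.3                 # assumed absorption chance per collision
        escapes = 0
        for _ in range(trials):
            x = thickness / 2.0           # start at the slab's center
            while True:
                step = random.expovariate(1.0 / mean_free_path)
                x += step if random.random() < 0.5 else -step
                if x < 0.0 or x > thickness:
                    escapes += 1          # neutron left the slab
                    break
                if random.random() < absorb_prob:
                    break                 # neutron absorbed; history ends
        return escapes / trials

    print(f"Estimated escape probability: {escape_probability():.3f}")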

Shortly after the war, true computers started to become available, beginning with the ENIAC, IBM's SSEC, and the National Bureau of Standards' SEAC. Because these machines were on the East Coast, many of the thermonuclear calculations actually took place far from Los Alamos. Although the first hydrogen bomb could have been developed without modern computers, its development would have been substantially delayed.

In 1946 the Hungarian-American mathematician John von Neumann was given the task of designing a powerful calculating machine to speed up this work. The result, a giant machine called MANIAC I, went into operation at Los Alamos in 1952. Used in hydrogen bomb design, MANIAC I was capable of 11,000 operations per second; it had a memory of 600 words, 80K of storage, an 11 kHz clock, and 2,400 vacuum tubes, and it was fully programmable. The team that programmed MANIAC was led by Stan Ulam (1909-1984), who designed the H-bomb with Edward Teller.

Initially the " ... lack of a fast computer slowed investigation of fusion theory. Thus, in 1949, the Super represented pure fantasy. ... The computational work involved in analyzing the thermonuclear process was a colossal task, and could not have been accomplished without the great advances in computing equipment that took place in the late 1940's and early 1950's. The IBM 601 was considered quite a machine in 1945, but it was far surpassed by the Maniac, designed at the Institute for Advanced Study, Princeton, New Jersey, and which became available in 1952. A problem that would require 3 months work by the 601 could be solved in 2 days by the Maniac. It is of record that, in the course of running a thermonuclear problem on the Princeton Maniac in 1953, the number of basic arithmetical computations performed was of the same order of magnitude as the total number of similar operations performed at Los Alamos (excluding those done on the Los Alamos Maniac) in the entire 10 years of operation of the Laboratory." ["History of the Early Thermonuclear Weapons" RS 3434/10, Page 16]

By 1964, CDC's 6600 supercomputer, designed by Seymour Cray, performed up to 3 million instructions per second, a processing speed three times faster than that of its closest competitor, the IBM Stretch. The 6600 remained the fastest computer in the world until it was surpassed by its successor, the CDC 7600, in 1968.
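
For a sense of scale, the speedups quoted above can be checked with simple arithmetic. The short Python sketch below uses only figures from the table and the quoted passage; the 90-day reading of "3 months" is an approximation:

    # Back-of-the-envelope check of the speedups quoted above, using only
    # figures from the table and the quoted passage. The 90-day reading of
    # "3 months" is an assumption.

    ibm601_days = 3 * 30                  # IBM 601: ~3 months per problem
    maniac_days = 2                       # MANIAC: 2 days for the same problem
    print(f"MANIAC speedup over the IBM 601: ~{ibm601_days / maniac_days:.0f}x")

    maniac_mips = 11_000 / 1e6            # 11,000 operations/s = 0.011 MIPS
    print(f"MANIAC throughput: {maniac_mips:.3f} MIPS")

    cdc6600_mips = 3.0                    # CDC 6600: up to 3 million instructions/s
    print(f"CDC 6600 over MANIAC: ~{cdc6600_mips / maniac_mips:.0f}x")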

In 1965, Intel co-founder Gordon Moore saw the future. His prediction, popularly known as Moore's Law, states that the number of transistors on a chip doubles about every two years. This observation about silicon integration, made a reality by Intel, has fueled the worldwide technology revolution. Over the following decades, microprocessor performance scaled from tens of thousands of instructions per second to tens of billions of instructions per second in today's products, as processors evolved from simple pipelines to superscalar designs that exploit instruction-level parallelism to make more efficient use of each clock cycle. Supercomputer speed is now measured in FLOPS: floating-point operations per second. The term exaFLOPS describes the processing of one quintillion (a million million million, or 10^18) floating-point operations per second; supercomputers working at that speed are referred to as "exascale" systems.
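
Moore's observation is easy to state as a formula: a count N0 doubles every two years, so N(t) = N0 * 2^(t/2). The Python sketch below illustrates the projection and the exa- prefix; the 1971 starting point of roughly 2,300 transistors (approximately the Intel 4004's count) is supplied purely for illustration:

    # Moore's Law as stated above: counts double about every two years,
    # i.e. N(t) = N0 * 2**(t / 2). The 1971 starting point (~2,300
    # transistors, roughly the Intel 4004) is supplied for illustration.

    def moores_law(n0, years, doubling_period=2.0):
        """Project a transistor count forward under Moore's Law."""
        return n0 * 2 ** (years / doubling_period)

    print(f"1971 chip projected 40 years ahead: {moores_law(2_300, 40):,.0f} transistors")

    # Prefix check: exa- means 10**18, so one exaFLOPS is a quintillion
    # floating-point operations per second.
    print(f"1 exaFLOPS = {10 ** 18:,} FLOPS")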

In 2010, Dell's new Inspiron 15 was a 15.6" laptop offering everyday features. This bottom-of-the-line model was a good value at $399.99. The processor was an Intel Pentium Dual-Core T4500 (2.3 GHz, 800 MHz FSB, 1 MB cache). It came with 2 GB of DDR2 memory (RAM) at 800 MHz included in the price, and could accommodate up to 8 GB of shared dual-channel DDR2 at 800 MHz for an additional $275.00.
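
Even setting architecture aside, the raw clock rates bracket the progression this section describes. Comparing the table's 11 kHz MANIAC against the 2.3 GHz Inspiron, with clock rate as only a crude proxy for throughput, gives a ratio of roughly two hundred thousand:

    # Crude clock-rate comparison between the 1952 MANIAC and the 2010
    # Inspiron 15, using the speeds given above. Clock rate ignores
    # architecture entirely, so treat the ratio as order-of-magnitude only.

    maniac_hz = 11_000                    # 11 kHz, from the table
    inspiron_hz = 2.3e9                   # 2.3 GHz, from the Dell specification
    print(f"Clock-rate ratio: ~{inspiron_hz / maniac_hz:,.0f}x")   # ~209,000x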



