== Early development ==
The first high-activity radioactive material suitable for radiological warfare was produced in the spent reactor fuel of the
Hanford Site during the
Manhattan Project. Over two months prior to the
Trinity test, a calibration test was carried out using an assembly similar to a
dirty bomb: on May 7, 1945, 108 tons of explosives dispersed a single fuel slug that had been irradiated at the Hanford Site to over 1,400 curies.

Prior to the
Normandy landings, members of the Manhattan Project anticipated a risk that the
German nuclear program had operational reactors and would use plutonium isotopes or
fission products from the spent fuel as a radiological weapon. The
Supreme Headquarters Allied Expeditionary Force authorized
Operation Peppermint to develop and distribute
Geiger counters, film packets, and other radiation survey meters to detect signs of radiological warfare.

In the post-war period, the United States pursued an offensive radiological weapons program. Supporters included
Ernest Lawrence and
Edward Teller.
Zirconium and
niobium radioisotope fission products were originally considered, but
tantalum-182 was ultimately judged the most effective. A radiological weapons stockpile inherently requires continuous operation of
production reactors to replenish the rapidly decaying weapon material. This conflicted with the infrastructure requirements of the emerging nuclear industrial complex, which demanded all US production reactor capacity for plutonium and, especially, for the short-lived polonium-210, at the time crucial for
neutron initiators.

== Salted nuclear weapons ==