Construction of nuclear power plants in the United States has been almost entirely dormant for many years. After a flurry of building in the 1960s and '70s, interest in new nuclear power plants peaked and then declined through the '80s. Ultimately, construction of new nuclear power plants effectively ended.
The reasons for this history are numerous and varied, but the overriding concerns that have impeded development of a nuclear power option in the U.S. have been excessive cost and indeterminate risk. The time required to build a plant can run, unpredictably, from 4 to 10 years or more, making financing very expensive and risk assessment extremely uncertain. Nevertheless, despite the slowdown in construction, as of August 2010 existing nuclear power plants contributed 19.2% of the electricity generated in the United States.
In fact, nuclear energy remains attractive to many analysts, and it appears to be undergoing a renaissance of interest among utility and governmental decision-makers. A nuclear power plant may be slow and expensive to bring online, but the power it produces is carbon-free, steady, and reliable. Furthermore, the cost of the fuel is low and its supply is highly dependable.
However, like all politically charged issues, this one has two sides. Whether waste disposal is a solved problem remains contested. The possibility of nuclear proliferation can never be fully discounted, and terrorist attack or sabotage might occur, no matter how carefully a plant is safeguarded. There is also significant use of carbon-based fuels in the mining and refining of uranium, the disposal of waste, and the decommissioning of old plants.