Nuclear Energy: Friend or Foe?

By Sam Bertini, CEO of Tabr

Nuclear energy has once again found itself in the spotlight following Iran’s controversial decision to continue enriching Uranium despite widespread global appeals. Nuclear energy is mired in controversy, and for good reason — its past is exceedingly fraught. The groundwork for the discovery of fission was laid by Henri Becquerel’s discovery of radioactivity in 1896 and the subsequent research into radioactive elements he conducted alongside Pierre and Marie Curie, which led to the three scientists being awarded the Nobel Prize in Physics in 1903. Several decades later, in 1938, German scientist Otto Hahn discovered fission when he produced the element Barium by bombarding Uranium with neutrons, rendering the Uranium nucleus unstable and forcing it to split roughly in half. Hahn’s discovery prompted widespread interest in the international scientific community, leading to an explosion of research into the mechanisms and applications of artificially induced nuclear fission.

Hungarian scientist Leo Szilard conceived the idea of a nuclear chain reaction, in which a single fission reaction could sustain others through the release of additional neutrons. Such a chain reaction could be exploited to generate previously inconceivable amounts of energy. Szilard, worried that the Germans might discover the military applications of fission, convinced Albert Einstein to send a letter to President Franklin Roosevelt in 1939 detailing the potential creation of a powerful Uranium bomb. Roosevelt, who found Einstein’s words to be of considerable “import,” ordered research into the creation of such a weapon, before formally authorizing the Manhattan Project (a research effort, with major sites in Tennessee and New Mexico, whose goal was to create nuclear weapons) in 1942. The first nuclear device was not detonated until 1945, in the ‘Trinity’ test in the New Mexico desert. Mere weeks later, ‘Little Boy’ (a Uranium-235 bomb) and ‘Fat Man’ (a Plutonium bomb) were detonated over the Japanese cities of Hiroshima and Nagasaki, killing many tens of thousands of civilians and forever changing the face of war.

The controversy does not end there — although nuclear energy has not been used in warfare since the end of the Second World War (despite several close calls), it has caused death and destruction in a series of high-profile accidents, from the cautionary tales of the meltdowns at Three Mile Island and Chernobyl to the much more recent Fukushima disaster in 2011. Nuclear energy’s short but eventful history as both a weapon of war and a proposed solution to the energy crisis has seen it burned into the collective consciousness as something to be feared. It is for this reason, perhaps, that it is often seen as a relic of an era that never was — and, in the view of skeptics, one that should never be.

Nuclear research has progressed significantly in recent decades, especially with regard to our ability to harness the power of nuclear fusion. Dozens of nations (including the US, China, and Russia) are collaborating on ITER, an attempt to build the world’s first fusion reactor capable of producing more energy than it consumes — the project has been ongoing since 2005, and assembly of the reactor is slated to begin in 2018. Considering these advancements, should we be so quick to dismiss nuclear energy?

It may not be prudent to do so, as fusion has already done more for humanity than any other form of energy. As science writer Daniel Clery so eloquently states, “We owe everything to fusion…without it, the Cosmos would be dark, cold, and lifeless”. To see fusion’s life-granting abilities in action, one need look no further than our Sun, which has been fusing hydrogen for billions of years and will likely continue to do so until long after the last human being has paid it any notice. Unlike fission, fusion does not produce “extremely harmful” radioactive waste — when deuterium and tritium (isotopes of hydrogen) fuse at extremely high temperatures (over 100 million degrees Celsius), the products are non-toxic helium gas and an energetic neutron rather than long-lived radioactive waste.
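To put rough numbers on that reaction, the short sketch below converts the standard textbook yield of a single deuterium-tritium fusion event (about 17.6 MeV) into energy per gram of fuel. It is a back-of-the-envelope illustration with rounded constants, not a statement about any particular reactor design.

    # Back-of-the-envelope yield of deuterium-tritium fusion (illustrative only).
    MEV_TO_JOULES = 1.602e-13       # one MeV expressed in joules
    AVOGADRO = 6.022e23             # particles per mole

    energy_per_reaction_j = 17.6 * MEV_TO_JOULES   # ~17.6 MeV released per D-T reaction
    # A reacting D-T pair has a combined mass of roughly 5 atomic mass units,
    # so one gram of fuel contains about AVOGADRO / 5 such pairs.
    pairs_per_gram = AVOGADRO / 5.0
    energy_per_gram_j = pairs_per_gram * energy_per_reaction_j

    print(f"Energy released per gram of D-T fuel: {energy_per_gram_j:.2e} J")  # ~3.4e+11 J

Even by this conservative estimate, a single gram of fuel releases hundreds of billions of joules, which is the scale behind the comparisons made later in this piece.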

Because fusion requires such extraordinarily high temperatures to occur, any malfunction in a fusion generator would simply cause the reaction to fizzle out, meaning there is no risk of a runaway reaction or of the kind of disastrous meltdown that has occurred at several fission plants. In addition, whilst the Uranium used in fission is in relatively limited supply, Earth’s stores of hydrogen (the most abundant element in the Universe, with deuterium readily extracted from seawater) are virtually inexhaustible. Nuclear fusion has the potential to produce energy on a scale that is unprecedented in human history. Given access to such power, how would we use it?

If the events at the end of the Second World War (and the many close calls since then) are any indication, it is not inconceivable that the power of nuclear fusion could be used for destructive purposes. In the past, warfare was waged with manpower as the prime driving force — more often than not, the side with the greater number of soldiers would prevail. The advent of nuclear technology, however, has enabled smaller nations to exert a disproportionate sway in geopolitics — until very recently, for example, North Korea had been using ‘warlike rhetoric’ to threaten its neighbor South Korea as well as the United States.

From a weapons perspective, nuclear technology is the culmination of a long historical progression of technologies that make the act of killing easier: from swords to guns to missiles, nuclear bombs represent the ultimate evolution of a killing device. In addition to being far more effective at killing, nuclear weapons deepen a detachment from warfare that has been growing throughout recent history. Hundreds of years ago, killing an enemy soldier might have meant looking him in the eye while piercing him with a sword; later, technological advancements in weaponry gave soldiers the ability to end a life by merely pulling a trigger. That is not to say, however, that this progression makes killing easy. The effects of killing and warfare on the mind have been studied extensively in recent decades, and the data show that soldiers at war still suffer profound psychological harm.

Since the Vietnam War, there has been a greater appreciation of how warfare damages the mind. Nearly one in five Vietnam veterans suffered from lifetime PTSD, and many more suffered from depression, alcohol abuse, and personality disorders, sometimes culminating in suicide. There is even data suggesting that the neurobiological effects of warfare can be transmitted to offspring, meaning the psychological toll of war does not end with the soldiers who fought it. The use of nuclear weaponry is the ultimate in personal detachment from warfare — with the press of a button, the deaths of hundreds of thousands can be orchestrated with minimal involvement from soldiers. Nuclear missile stores are likely to increase, but there are already enough nuclear weapons to ‘kill millions of people and flatten dozens of cities’. The implications are many: killing in warfare becomes easier and more remote, the cost of initiating war falls, the psychological burden on individual soldiers is reduced, and yet the hesitation to actually engage grows.

Since the unparalleled bloodshed of World War I and World War II, the life of a soldier — and, indeed, human life in general — has become less expendable. We have seen the effects of war, and so there is a greater hesitance to send soldiers to their deaths. This does not mean, however, that governments are tempted to reach for nuclear weapons as a substitute. Governments are well aware that the use of nuclear weapons would bring about mutually assured destruction — as such, there has been extreme hesitance to make use of nuclear stockpiles since 1945. Memories of the Cold War, when “the ever-present threat of nuclear annihilation” prompted people to build “bomb shelters in their backyards,” still linger in both America and the post-Soviet states.

So too do memories of Hiroshima and Nagasaki, even if — as John Berger argues in “Hiroshima” — they are “mostly suppressed”. The bombings, which Berger refers to as “evil,” “shocked and surprised” the world in the days that followed. Those unfortunate enough to be caught in the blast did not necessarily die quickly — countless innocents suffered “long, lingering deaths” after falling victim to severe radiation poisoning. Those who survived did not escape unscathed, either: the ‘lucky’ ones, if you can call them that, were thereafter referred to as “those who have seen hell.” It is not unreasonable to say that the superpowers armed with nuclear weapons have learned from the consequences of the events of 6th August, 1945. After all, there has been considerable public backlash against the horrors that nuclear weaponry can cause — this may have contributed to the absence of nuclear warfare since then.

Furthermore, North Korea’s Supreme Leader Kim Jong-un — perhaps the greatest nuclear threat the world has seen since the Cold War — recently promised a new “age of peace” in Korea. Although it is difficult to take his promise at face value, the declarations of “never again” that followed the bombings of Hiroshima and Nagasaki now seem like less of a pipe dream than they did before the collapse of the USSR, or even mere weeks ago. The immediacy and finality of the loss of life when the bomb fell on Hiroshima awoke a newfound appreciation of the value, and certainly the fragility, of life. In the past, hundreds of thousands would die in campaigns that lasted years; to see similar losses of human life within minutes was unfathomable, and it instilled a repugnance that remains to this day. Perhaps people are less blind to the horrors of Hiroshima than Berger believed.

North Korea’s strides towards military denuclearization may herald the beginning of a new nuclear age — an age in which developments in nuclear technology are used not to attack and destroy but to preserve and protect. Our planet, and by extension our species, is certainly in need of better protection than it is currently afforded. We rely on fossil fuels for over 80 percent of our current energy needs, while global demand for energy is projected to increase by over 50 percent in the coming decades. Stores of fossil fuels are expected to be exhausted within decades, and once they are gone there is no going back (not for millions of years, at least). To further compound the looming crisis, the sources of energy we do use are driving climate change the likes of which has not been seen in modern history. There is reason to believe that nuclear fusion — and, to a lesser extent, fission, if proper safety and disposal methods are realized — has a vital part to play in combating the situation we face. Fusion is inexhaustible and does not contribute to environmental degradation; as a result, it is well placed to become the foremost source of energy production for the future.

Fusion’s exciting potential, however, is often tempered by skepticism. Unfortunately, fusion is far from a quick fix. Some UK scientists have been critical of ITER’s slow progress and its use of ‘outdated, costly technology’ (Austin). In addition, they believe (contrary to ITER’s proposed timeline) that the first fusion plant will not be operational until 2050, by which time fossil fuels may well have been exhausted and irreparable damage done to the atmosphere. Although its shortcomings have seen it labeled the ‘most expensive technological failure of the modern age,’ investment in fusion is not optional — it is all but mandatory. Its proposed ‘green competitors’, including solar, wind, and geothermal power, are simply not as reliable. A reliance on such sources would put the demands of our civilization at the mercy of weather patterns, and as such they are no true solution to the crisis we face. In addition, when it comes to raw energy production, there is no comparison: the average solar panel produces in the realm of 250 Joules per second, while a single gram of hydrogen fuel could release on the order of 10^12 Joules — roughly four billion times what a solar panel produces each second, even in ideal conditions.
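As a quick sanity check on that comparison, the snippet below reruns the arithmetic using the round figures quoted above (a 250-watt panel and roughly 10^12 joules per gram of fusion fuel); both numbers are illustrative estimates rather than measured values.

    # Compare one gram of fusion fuel with a solar panel's per-second output,
    # using the round figures quoted in the text (illustrative estimates only).
    SOLAR_PANEL_WATTS = 250.0          # joules per second from an average panel
    FUSION_JOULES_PER_GRAM = 1e12      # quoted figure for one gram of fuel

    ratio = FUSION_JOULES_PER_GRAM / SOLAR_PANEL_WATTS
    years_of_panel_output = ratio / (3600 * 24 * 365)
    print(f"One gram of fuel ~ {ratio:.1e} panel-seconds")                      # ~4.0e+09
    print(f"Equivalent to about {years_of_panel_output:.0f} years of panel output")

By these figures, a single gram of fuel matches roughly 127 years of continuous output from one panel, which is where the ‘four billion times’ claim comes from.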

Perhaps fusion’s success as an energy source to date has been limited, but that doesn’t mean that it should be abandoned — especially when our survival as a species (not to mention the survival of our planet) is contingent upon sustainable energy production. We need not rely on the Sun and the unpredictable weather patterns it creates in order to harness energy. Not when we can create many thousands of miniature Suns right here on Earth to produce all the energy we could ever conceivably need.

Nuclear fusion has given life to the Universe — without it, there would be no light, no stars, no planets. When nuclear fusion becomes an impossibility, the Universe will go dark, extinguishing all life permanently. Fusion is neither good nor evil: such labels need not apply, because fusion is one of the most fundamental processes in nature, uncaring of human life or perception. The fact that we are so close to being able to harness the power of fusion is a testament to human ambition and achievement, and the cynicism that perpetually shrouds nuclear energy is largely unwarranted. Granted, nuclear weaponry will not be disappearing any time soon, and its presence is surely cause for concern. However, we have learned much from Hiroshima — the sudden loss of hundreds of thousands of innocents caused something of a paradigm shift in our perception of human life. We are not expendable, and killing with such reckless abandon is not tolerable.

What truly killed those innocents was not the fission of Uranium nuclei but the human beings who decided that countless lives were discardable. It is for this reason that to distrust nuclear energy is to distrust humanity, and dismissing fusion on the grounds of its possible military applications would be short-sighted. We desperately need an inexhaustible, eco-friendly source of energy capable of sustaining human life for thousands of years to come (and perhaps far longer, if we are lucky). Currently, there is no alternative to fusion, and it is doubtful that we will find one, considering that it is the very process that powers the most energetic objects in the Universe. If fusion bombs flatten cities, it would not be right to blame nuclear technology. If fusion generators save us from the damage we have done to the Earth and end the energy crisis, it would not be right to credit nuclear technology. Ultimately, how we make use of fusion (and how we don’t) says more about us than anything else: in this sense, access to such energy holds up a mirror to ourselves.