CAPITALISM NATURE SOCIALISM
Past Events
 

Eco-Sufficiency & Global Justice - A new book by Ariel Salleh

Selections from the CNS conference in Toronto, July 22-24, 2005:

Towards Ecological and Social Justice in the North and in the South: Italian CNS as a Journal and a Movement - Giovanna Ricoveri

Public Participation and Ecological Valuation: Inclusive = Radical - Patricia E. (Ellie) Perkins

Comafrica Institute, in Brazil, is pleased to announce its new website: www.comafrica.org

Visit the York University website to read papers that were presented at the CNS conference in Toronto. www.yorku.ca/cnsconf

Towards ecological and social justice in the North and in the South: Italian CNS as a journal and a movement - Giovanna Ricoveri

 

Commons vs. Commodities*

Giovanna Ricoveri

The Commons as an Alternative to Capitalism

In this paper, I propose that the subsistence commons of the past can form the backbone of an alternative social order to capitalism, which has dominated the world for the last three centuries. This is because they are based on cooperation, not on competition; they are jointly used, being neither private nor public property; they use natural resources sustainably; and they promote forms of direct democracy that integrate and reinforce representative democracy. In brief, they provide goods and services that do not become commodities to be exchanged on the capitalistic market.

I realize that my proposal may be perceived as unfair in the so-called less developed countries, engaged as they are in striving to reach Western living standards. I answer this objection in two ways: first, my proposal addresses the countries of the North, which use more than their share of natural resources at the expense of the South. Second, my proposal is paradigmatic, and therefore concerns both the North and the South. Its purpose is to underline modes of production and ways of living that are ecologically sustainable and socially determined.

The return of the commons is a necessity vis-à-vis the crisis of the capitalist system: it aims at stopping the plunder of Nature and the disintegration of society, which in the South translates into hunger and death from hunger for more than 1 billion people. Climate change, privatization of natural resources and public spaces, new forms of poverty, unemployment of women and the young, social exclusion, food insecurity, and new diseases caused by pollution of water, air, and food chains are only some of the emerging features of capitalism in its present financial phase, which produces “paper” wealth while destroying real wealth.

In support of prioritizing subsistence commons, consider the following: first, evidence shows that at the beginning of the third millennium, the profit frontier has moved into natural resources and public goods. Capital now seeks to appropriate natural goods, infrastructures and services, which are both a gift of Nature and the result of the work and ingenuity of local populations and communities. It is a collective wealth that multinationals and financial capital try to appropriate by any means necessary, including wars of both low and high intensity. It is therefore imperative for local communities to resist this trend and reclaim the common wealth in order to survive.

Second, although nobody would deny that life depends on “subsistence” goods and services, in Western cultures—which have spread to large parts of the South—it is taken for granted that basic goods and services are supplied by the market. The market is nonetheless unable to “produce” air, water and land, the basic goods essential to the lives of poor and rich alike. The market is unable to allocate natural resources in such a way as to grant everybody, rich and poor, their share of water, air and land. Nor is the market capable of avoiding the wastage of natural resources, in spite of efforts made to regulate it. The water shortage, for example, is in fact created by the capitalist market: there is no water shortage where water is managed by water communities, which still exist in both North and South, because they use techniques of rainwater collection and other traditional methods of conservation and utilization. All of these methods rest on traditional knowledge, built up through the work and inventiveness of present and past members of the community.

Moreover, subsistence is socially and environmentally determined, and therefore changes over time and from place to place. Resources such as iron, on which industrial production mainly located in the West depends, could now be considered part of subsistence, since iron ores are being depleted and supply is insufficient to satisfy demand. Substitution for iron is possible, but only within certain limits, and iron is not renewable on the scale of human lifespans. If this way of reasoning—formally advanced by several communities of the South—were followed, many of the environmental and social problems created by the capitalist system would not exist, because communities use natural resources without depleting them.

The claim that subsistence or material commons exist only in the so-called less developed countries of the South, and that non-material commons are mainly a problem of the industrialized countries of the North, is widely accepted in the West by scholars and activists. The evidence of all empirical studies on the commons [see, first and foremost, Elinor Ostrom’s (1990) work] shows that this belief is wrong. Moreover, such a notion is dangerous because it reinforces a false dichotomy between the North and the South, thus favoring ideologies of exclusion and racism.

In brief, it is a powerful device through which the North keeps its privileges and legitimizes the plunder of the South.

The Historical Process of Delegitimation of the Commons

The underestimation of natural commons and the parallel overestimation of non-material commons prevailing in Western discourse is the consequence of a process that Carolyn Merchant (1980) has called the “death of Nature.” With the Scientific Revolution of the 16th and 17th centuries, Nature ceased to be considered sacred and was erased from the human horizon, to the point that people have lost even the perception of it. Slowly but progressively, Nature has been transformed into a deposit of lifeless resources, inputs for industrial production at the disposal of corporations.

The organic metabolism between people and nature, which had made possible the sustainability of the past, was substituted by an industrial metabolism between people and industrial production.
In the transition from the Middle Ages to Modernity, the commons became a hindrance to change and “progress,” and were therefore dismantled. In England, common lands were enclosed or privatized both to provide the wool needed as a raw material by the emerging textile industry and to free the labor necessary to run the new manufacturing in the cities. To carry out the Industrial Revolution, England conquered large parts of the Americas (the colonies) and deported millions of Africans to employ them as slaves in the sugar plantations. And the same imperial formation conquered and used much of South Asia as the cash cow for the investments needed to develop the technologies and techniques required for industrialization (De Cecco 1974; Frank 1978).

The social consequences of these events have been analyzed and criticized by scholars such as Karl Marx (in Capital) and Karl Polanyi (1944). Environmental consequences have emerged slowly and are still unfolding as in the cases of climate change, loss of biodiversity, and the emergence of new diseases resulting from pollution, among others (Crosby 2004). These processes have been reflected in ideological changes.

The formalization of political economy as a science at the end of the 18th century resulted from and contributed to the end of the social order preceding the Industrial Revolution. The theories of the founding fathers of political economy—Adam Smith, David Ricardo, and John Stuart Mill—justified the expansion of the capitalist market, which changed the course of history for the entire world. The invisible hand, homo economicus, and the theory of comparative advantage and free competition opened a new phase in ideology, wiping out even the memory of the commons, at first largely among intellectuals and the ruling classes of Western Europe. The imperative of vertical competition won out over the necessity of horizontal cooperation.

Characteristics of the Commons

Commons were the prevailing form of social organization in the European Middle Ages and are widespread even now in the South, where village—often native—communities still exist. This is the case in most Sub-Saharan African countries, in the countries of South, Southeast, and East Asia, including China and India, and in the Andean countries of Latin America. The United Nations estimates that over two-thirds of the global population lives in the countryside and nearby forests, where people survive thanks to direct access to subsistence resources.

The commons are jointly used resources, administered and self-managed by local communities. They are not just resources in the sense of physical entities, such as a piece of land to cultivate, a pasture, a pool of water, or a fishing area. The commons may also take the form of common rights to use the fruits of a given natural resource, as in Anglo-Saxon common law or the “usi civici” of the Italian legal tradition; of the “claims” still weighing on natural goods that allow communities to survive or further their means of survival; or of genetic resources, such as those covered by the 2001 FAO International Treaty on Plant Genetic Resources for Food and Agriculture.

The commons are hard to define because they vary in time and space. Their strength lies in diversity and specificity, i.e., in the ability of communities to adjust to different situations. It is nonetheless possible to identify their main characteristics, the first being flexibility. Another is self-management by local communities, which may mean either a group of people jointly using a natural resource (e.g., a piece of agricultural land) or a village authority that allocates fertile lands among village families, on condition that such land be cultivated for family consumption, not for commercial ends.

The community functions according to a logic entirely different from that of the capitalist market, meaning that exchange is based on interpersonal relationships, not the impersonal exchange of equivalent things. It is also for this reason that community is a controversial concept, often rejected by cultures prevailing in the North, which identify community ties with blood and tribal boundaries rather than with ties of proximity and solidarity. Such a view, in any case, presumes that problems specific to Northern societies are representative of all humanity.

Another element of the commons is the joint use of natural subsistence resources, which are held neither as private property nor as public property in the sense of belonging to the State. This is something difficult even to perceive in the West, particularly in countries with no tradition of common law, such as Italy. However, it has been pointed out that private property has prevailed only for a limited and recent period of human history, while collective property is the original form of land tenure, prevailing through the bulk of human history (Grossi 1981; Thompson 1993). In Western culture, Nature holds no rights; recognizing such rights would prevent human rights over natural resources from being rendered void by the withering of the physical base on which they rest, be it water, air, land, or fire/energy (see the recent Constitutions passed in Bolivia and Ecuador).

The Market-State dichotomy, a founding principle of the market economy under capitalism, is now under attack due to the ecological crisis induced by the bad management of natural resources both by the State and by the Market. The Market-State dichotomy is also being questioned by scholars, who point out the role of modern Western States in privatizing resources and reducing political democracy to parliamentary democracy. Citizen participation in public choices is not a price to pay, but a resource to utilize (Mattei and Nader 2008).

Whatever their geographical configuration or the historical period considered, the commons represent a system of social relations based on cooperation and reciprocity. They provide sustenance, security, and independence yet do not produce commodities. They express a productive and social order based on cooperation, not on competition (The Ecologist 1992).

The commons are an institution that has survived through time, in spite of the enclosures to which they have been exposed over the past several centuries. They survive because they are flexible enough to adjust, and because they embody inalienable human rights: spaces of self-organization that satisfy the need for social relations embedded in human nature. In brief, they express a mode of social organization alternative to that of homo economicus, as theorized in mainstream economics.

The commons cannot be alienated, since communities are not the proprietors of the resources on which the commons subsist; and this holds even when products are exchanged between communities. When the question arises as to whom natural resources belong, the answer is “nobody,” since natural resources and the ecosystem services of Nature are a free gift to all beings, human and non-human.

In feudal Europe, land and the other natural resources on which communities made their living belonged to the “prince” (the aristocracy and the Church), who was also the judge sitting in courts to resolve conflicts over the commons. In the countries of the South, by contrast, those natural resources traditionally belonged to local communities, represented by village authorities. All of that changed with the arrival of the State and of private property, which competed to appropriate the commons.

Local communities fought back, and in some cases succeeded in keeping their rights over the resources. But generally speaking, the enclosures went on and made the remaining commons appear as an unwanted legacy of the past, something irrelevant that could be done away with. This holds for all types of commons—water, forests, fishing rights, jointly run agricultural fields. Natural catastrophes such as the 2011 Tōhoku earthquake and tsunami in Japan, British Petroleum’s devastating oil spill in the Gulf of Mexico in 2010, and the 2004 tsunami in Southeast Asia show that it is wrong to consider the commons as something belonging to the past; but this fact does not seem to be enough to stop prevailing trends.

The commons are also ecological and cultural systems. They are the foundations of life, since they supply essential goods such as water, food, shelter, fuel and medicines. These are goods that the capitalist market can supply only in part, and only as commodities to be bought at prices and under conditions that consumers are forced to accept, with no control over the allocation of natural resources, over prices, or over the quality of the final products.

The distinction between local and global commons, often used in the literature, is not well founded since “the global is always a globalized local.” The global system today that governs the world is not universal in any epistemological way; rather it is the globalized version of a local tradition—usually of Western European origin—that has been able to impose itself violently on the rest of the world. As Vandana Shiva (1993) points out: “The construction of the global is responsible for the destruction of the environment, i.e., of resources with which local populations survive…it is the political tool with which the dominant forces escape their responsibilities, letting them fall over local communities.”

To conclude this point, the commons are local systems, geographically diverse even within the same historical period. It is exactly for this reason that they represent a realistic alternative (though not the only alternative) to the paradigm of the market. Their diversity and flexibility allow for the best use of resources, and for avoiding the over-exploitation, deterioration and destruction that are inevitable in the capitalist system. Moreover, they promote human creativity, intelligence and energy, which are the scarcest, yet most important, resources of a society that must be ecologically and socially sustainable.

The New Enclosures

Climate change is one of the most important enclosures to date, caused by the emission of greenhouse gases from fossil fuel combustion and deforestation. The atmosphere—once a commons everybody could use—has been appropriated by oil, coal, energy, steel, cement and automobile corporations as a dump for the polluting by-products of their production processes. They discharge into the atmosphere a quantity of gases greater than the atmosphere can absorb. The amount of CO2 produced by fossil-fuel energy has deprived human beings, animals and plants of their share of clean air, giving rise to global climate changes for which the poor pay the higher price, even though they are least responsible. The market mechanism of CO2 allowances defined by the 1997 Kyoto Protocol—under which polluters may buy pollution credits from those emitting less CO2—creates a second level of enclosure of the atmosphere, one that hands property rights over the atmosphere, free of charge, to those who pollute more than others: multinationals and the rich countries of the North.

The loss of biodiversity, and the patenting of the seeds and knowledge necessary for their conservation and improvement, are another important case of new enclosures, carried out at the expense of the peasants, natives and local communities who were their keepers for centuries and millennia. Biodiversity changes and evolves through time: the genetic heritage now existing on Earth is the result of evolutionary processes spanning 3.8 billion years. The Millennium Ecosystem Assessment, one of the most complete and reliable studies of the Earth’s ecosystems, published by the World Resources Institute in 2005, finds that human impact has fundamentally modified biodiversity, to some extent irreversibly, mostly in the form of species losses.

In the last few decades, the once-slow decline of biodiversity has become a worrisome trend. The 2008 Living Planet Report compiled by the World Wildlife Fund (WWF) indicates that in the last 35 years the world’s human population doubled while animal populations decreased by one-third, and the area covered by virgin forests (where most biodiversity is located) shrank by 50 percent. The WWF report, which covers 1,680 animal species, reveals an overall loss of biodiversity of 28 percent, with losses of 35 percent in freshwater ecosystems, 44 percent in drylands, and up to 51 percent in the tropics. FAO estimates confirm that 75 percent of crop varieties have already been lost, and that out of 30,000 edible species, only 30 supply 95 percent of the world population’s food requirements. It is therefore likely that thousands of other varieties will be lost in the next few decades.

Patents on seeds and traditional knowledge are the other face of the same coin. One of the most controversial steps of the enclosure process was the Papal Bull issued in 1493, which authorized the conquest of the Americas in the wake of Christopher Columbus. The 1995 agreements on international trade—the World Trade Organization (WTO) and Trade-Related Intellectual Property Rights (TRIPs)—treat seeds and plants as “intellectual property,” i.e., as a product of the mind rather than a component of Nature. Agreements on international commerce have thus cancelled the common rights of peasants, natives and local communities to plant their own seeds, obliging peasants to buy seeds patented by multinationals, who in turn demand royalties and, for example, force Africans with HIV/AIDS to buy patented drugs from companies whose products ultimately derive from biopiracy.

World hunger is one of the most serious and immoral problems of the last few decades. It has several causes, each tied to one or more processes of enclosure of a common. First and foremost, it has to do with land enclosure in the rural economies of the South. Other structural causes are water scarcity, climate change, interference in hydrological cycles, loss of land fertility, industrial agriculture, feed vs. food competition, and the monocultures of basic crops (in the South) to serve as fuel for automobiles (in the North). The 2009 FAO report on world hunger estimates that more than 1 billion people are hungry and that a few million—the majority of them landless peasants in the South—die from hunger each year, while the number of overweight people in the North increases each year. According to the FAO, the geographical distribution of the hungry is: 642 million in Asia and the Pacific, 265 million in Sub-Saharan Africa, 53 million in Latin America and the Caribbean, 42 million in the Near East and North Africa, and 15 million in developed countries.

Water privatization has taken on new connotations in recent decades, one of which is particularly odious—the construction of mega dams. The phenomenon is not new, but it became more serious after the Second World War, thanks to the World Bank’s energy policy. The World Bank and governments are convinced that large-scale dams are necessary to produce energy and sustain industrial development. But such dams are a serious risk to the security of nearby populations. They exact heavy and devastating environmental and social costs, such as the creation of numerous refugees and displaced local people, who are excluded both from the political decision to build the dam and from the fruits of its operation. The energy produced by the dams is used by big farming and manufacturing corporations, not by the displaced peasants.

The number of dams 15 meters high and above increased tenfold in roughly 50 years, from a little more than 5,000 in 1950 to almost 50,000 by the end of the last century, most of them located in the South (25 percent in China). Put on trial since the 1980s by international public opinion and by popular movements everywhere—particularly in India, where the mammoth Sardar Sarovar Dam was under construction on the Narmada river (Roy 1999)—the construction of mega dams continues, although at a slower pace. Part of this trend is evinced in the Three Gorges Dam—dubbed the Great Wall of the 21st century—built on the Yangtze River in China. Inaugurated in 2006 and not yet finished, this dam is 185 meters high and has a water reservoir 600 km long, the largest in the world. It took thirteen years to build, at a cost of about €25 billion, and has already submerged thirteen large cities and 116 urban centers, transforming more than 1 million people into refugees.

The privatization of the sky is another of the new enclosures. Around the Earth there is now a flood of technological tools for telephone, television, computer networks and other means of communication, of craft parked in orbit over people’s heads, and of military and civilian airplanes releasing heavy quantities of greenhouse gases. All these activities (sometimes illicit, or covered by military secrecy) are a source of great profit for multinationals and governments, since they use a common good—space—free of charge, at the expense of people’s health and security. Civilian air traffic is an important aspect of this problem, since air transport has grown quickly. And it could grow even more with the introduction of biofuels, which are incorrectly described as emitting net zero CO2. When the entire production cycle of biofuels is taken into consideration—from the cutting of virgin forests, to the cultivation of monocultures of soybeans, sugarcane and palm oil, to the likely expansion of air traffic to an increasing number of new airports—the claim that biofuels are carbon-free is clearly untrue.

The lack of maintenance of the territory produces heavy consequences, such as increased desertification and soil erosion, and it facilitates so-called natural catastrophes, which are not natural but socially determined, as in the case of landslides that occur when it rains more than expected. Lack of maintenance also worsens the consequences of natural catastrophes, as in the 2004 Southeast Asia tsunami, in which almost 300,000 people died, many of whom could have been saved if the sea coasts had been protected by mangroves instead of having been built over. Many of the negative consequences of a catastrophe could be avoided or dramatically reduced by appropriate maintenance of the territory: protecting coasts, governing rivers and the flow of water, and avoiding deforestation so that agricultural and forest biomass can perform its role of countering soil erosion and landslides, as well as absorbing CO2 and other greenhouse gases. For all these reasons, the lack of maintenance of the territory can be included among the new enclosures, whose effect is to appropriate the means of subsistence belonging to local communities.

The Return of the Commons: A Proposal

The crisis of global capitalism, which appeared in all its depth with the defaults on subprime loans in the U.S.A., has accentuated the crisis of politics and of political parties as the privileged subjects of politics. The demand that political parties make room for movements as the new subjects of alternatives is heard more and more frequently in the West. What movements are we talking about? This is difficult to answer, given the differences existing among countries in general, and between North and South in particular—the most important being the rule of law, the concept of democracy, and the role of political parties. The rule of law existing in the North is a Western ideology, one that justifies the West’s hegemony over the world. What is needed as an alternative is a strategic vision and a comparative knowledge of different systems, which is not available at present. The case of Italy can, to some extent, represent Western European countries, although I am aware of the limits of the Western culture within which this proposal is formulated.

In today’s Italy, many subjects are a legitimate part of various movements. They include workers in factories in crisis, the unemployed young, groups of citizens fighting against urban environmental destruction, and the organizations and associations that experiment with new ways of producing and living. Other subjects include local governments, parts of the trade unions, some trade and cultural organizations, entrepreneurs running out of steam, and groups competing on the market in ways other than plunder. Subjects like these exist everywhere in the North, under forms specific to each country, but all try to open a new public space.

The cultural context within which the movements for alternatives can work is determined by the limits of Nature and natural resources. To follow this perspective, it is necessary to carry out an ecological conversion of markets and production, i.e., to “territorialize” them, starting from the most sensitive sectors, such as the automobile industry—in Italy as in all industrial countries. The automobile as it is now has no future, both because of pollution and because it no longer serves the purpose of mobility. Another sector of the economy and society needing quick conversion is energy, which must opt for renewable sources: the technology exists, but not the political will. Another priority is industrial agriculture, which has to make space for small-scale peasant agriculture using traditional techniques free of agrochemicals. The fourth priority is the maintenance of the territory. The fifth is the use and reuse of metals and other minerals, both because some of them are scarce—or have become scarce through over-exploitation—and because metal and mineral extraction damages the Earth. Other priorities are water, public services, and so on.

The list is long. What needs to be stressed here is that the conversion to sustainable resource use that I am talking about is not planned by the Western State, centralized and bureaucratic. It is instead carried out by movements and communities at the local level: at the scale of single factories and farms, of town districts, and of whole towns. The planning we need results from thousands of initiatives, not from a State monopoly.

Although the overall damage caused by financial capitalism now exceeds the individual advantages that the system once granted (at least to a part of the world population), the critique of the system hasn’t so far produced any alternative. This is due mainly to strong resistance by the prevailing ruling class, which takes advantage of its position and is able to make its lies appear true. Another reason—the decisive one—is the absence of a leadership with strategic vision, one capable of mobilizing the people.

In this context, the strategic questions to face are many, and all are as heavy as stones:
First, it should be agreed that Keynesian politics and policies are outdated and that the present crisis cannot be dealt with by raising demand and public expenditure as President Roosevelt did to deal with the Great Depression. The present crisis is caused by speculative finance, environmental disruption and bad politics, which were not as profound during the thirties as they are today. Furthermore, the same level of market globalization, privatization of nature, and commodification of life in all its aspects did not exist at that time.

Second, it would be necessary to agree that the State-Market dichotomy has become inadequate, that the subjects of change are several, and that there is no longer a single historical subject for an alternative system—neither the proletariat, nor the working class, nor the enlightened bourgeoisie, and even less the multitude.

Third, in the process of change and territorialization, local communities or movements have a central role to play. This is a point to be debated to avoid the charge that the return to the commons is a return to the past.

Fourth, the word “progress” is also out. Too often it has been used to justify the greatest injustices of the last century, starting with war. Another word that is out is “development,” though “progress” will be even more difficult to excise, even from scholarly discourse.

Fifth, it should be recognized that politics is discredited. This can only be overcome through local communities exercising self-government, deciding over local resources and the local questions related to their territory. This change is necessary but not simple, and history doesn’t help. However, no movement in history has ever been global before, and this can make the difference.

The “return” of the commons, as I have called my proposal, goes beyond the reclamation of the commons. It is a proposal to be considered according to the above strategic perspective. It requires us to consider environmental movements both as instances that represent the needs of the people living in actual ecosystems and as a world movement imposing political change and new forms of direct democracy.

Today, the movements of the North are at a strong disadvantage, since only governments—state or local—have title to decide, and this limits the democratic participation of citizens in matters that directly concern them. The present setup, according to which common goods are part of the public trust, is no guarantee against their erosion and privatization, not least because the corruption of public power and bad politics are always lurking in the background.

Therefore, I end by posing two last challenges. First, my proposal needs to be problematized so as to account for natural, historical and geopolitical differences among places, since communities—like people—are not necessarily good-hearted and just.

Second, the return of the commons will not be a real alternative to capitalism, unless the new communities are united among themselves and open to the world, or better still, cosmopolitan.

References

Crosby, A.W. 2004. Ecological imperialism: The biological expansion of Europe, 900-1900. Cambridge: Cambridge University Press.
De Cecco, M. 1974. Money and empire: The international gold standard, 1890-1914. Oxford: Oxford University Press.
Frank, A.G. 1978. World accumulation. New York: Monthly Review Press.
Grossi, P. 1981. An alternative to private property. Chicago: University of Chicago Press.
Mattei, U. and L. Nader. 2008. Plunder: When the rule of law is illegal. New York and Cambridge: Cambridge University Press.
Merchant, C. 1980. The death of nature: Women, ecology, and the scientific revolution. San Francisco: Harper & Row.
Ostrom, E. 1990. Governing the commons: The evolution of institutions for collective action. Cambridge: Cambridge University Press.
Polanyi, K. 1944. The great transformation. New York: Holt, Rinehart & Winston.
Roy, A. 1999. The greater common good. Bombay: India Book Distributor.
Shiva, V. 1993. The greening of the global reach. In Global ecology, ed. W. Sachs, 150. London: Zed Books.
The Ecologist. 1992. Special issue: Whose common future? 22 (4).
Thompson, E.P. 1993. Customs in common. New York: The New Press.

 

The following images are from INTERSECTIONS, a series of paintings by Dave Channon that places dangerously invasive and fabulous endangered species in stark contrast to the built environment, calling into question the balance of life on Earth. See more of Dave Channon's art at www.EsopusCreek.com

 

Asian Longhorn Beetle

 

 

Gypsy Moth

 

 

Emerald Ash Borer

 

Kraken the Atom by Dave Channon

Nuclear Damage Control
By Karen Charman, WhoWhatWhy.com, February 10, 2012

What if you were promoting an industry that had the potential to kill and injure enormous numbers of people as well as contaminate large areas of land for tens of thousands of years? What if this industry created vast stockpiles of deadly waste but nevertheless required massive amounts of public funding to keep it going? My guess is that you might want to hide that information.
From the heyday of the environmental movement in the late 1960s through the late 1970s, many people were openly skeptical about the destructive potential of the nuclear power industry. After the partial meltdown at Three Mile Island in central Pennsylvania in March 1979 and the explosion of Chernobyl’s unit four reactor in the Ukraine in April 1986, few would have predicted that nuclear power could ever shake off its global pariah status.
Yet, thanks to diligent lobbying efforts, strong government support, and a full public-relations blitz over the past decade, the once-reviled nuclear industry succeeded in recasting itself in the public mind as an essential, affordable, clean (low carbon emission), and safe energy option in a warming world. In fact, the U.S. Nuclear Regulatory Commission (NRC) has just cleared the way for granting the first two licenses for any new reactors in more than 30 years. The new reactors will be built at the Vogtle plant in Georgia, southeast of Augusta.
Even so, the ongoing crisis following meltdowns in three of the six reactors at the Fukushima Daiichi nuclear complex in Japan nearly a year ago has shined an unwanted spotlight on the dark side of nuclear power, once again raising questions about the reliability and safety of atomic reactors.
In response, the nuclear industry and its supporters have employed sophisticated press manipulation to move the public conversation away from these thorny issues. One example is PBS’s recent Frontline documentary, Nuclear Aftershocks, which examines the viability of nuclear power in a post-Fukushima world.
What follows is a detailed critique of many of the issues raised in the program, which initially aired January 17, 2012.
***
In the program, NASA’s celebrated chief climate scientist, James Hansen—who has a penchant for getting arrested protesting the extraction and burning of the dirtiest fossil fuels—says that the Fukushima accident was “really extremely bad timing.” Though it came at the end of a statement about the harm of continuing to burn fossil fuels, Hansen’s comment raises the question: Is there ever a good time or place for a nuclear catastrophe?
Under the cloud of what some experts believe is already worse than Chernobyl, the nuclear industry and its supporters are scrambling to put as good a face on the Fukushima Daiichi disaster as possible.
Fukushima’s triple meltdowns, which are greatly complicating and prolonging the cleanup of the estimated 20 million metric tons of debris from the 9.0 earthquake and subsequent tsunami last March, present a steep public relations challenge.
The strategy seems to be: 1) to acknowledge the undeniable—the blown-up reactor buildings that look like they were bombed in a war, the massive release of radionuclides into the environment, the fact that tens of thousands of people have been displaced from their homes and livelihoods, and that some areas may not be habitable for generations, if ever. But then, 2) after coming clean about those harsh truths, downplay or dismiss the harm of the ongoing radiation contamination, invoking (irrational) “fear” as the much greater danger. And 3) frame discussion of the need for nuclear power in the even scarier context of global warming-induced catastrophic climate change (this despite the irony that the reality of global warming is still rejected by fossil fuel industry partisans and growing numbers of the public who have been swayed by the industry’s media-amplified misinformation). Whether consciously or not, Frontline’s Nuclear Aftershocks adheres to this PR strategy.
The program begins with a harrowing view of nuclear power at its most destructive. Viewers see close-ups of the three destroyed Fukushima Daiichi reactors with the tops of their buildings blown off amidst the wreckage around the plant. Real time video captured on cell phones shows the precipitating earthquake, and there is film of the ensuing tsunami that engulfed the plant.
Frontline also captures the dystopian scene of an utterly destroyed landscape littered with seemingly unending tracts of twisted and broken buildings, infrastructure, and the various trappings of modern Japanese life—much of it now radioactive detritus. A member of the Japanese Atomic Energy Commission who toured the plant six weeks after the beginning of the disaster sums it up with this simple comment: “This scenery is beyond my imagination.”
Frontline clearly explains how, without electricity to run the valves and pumps that push water through the reactors’ cooling systems, the intensely radioactive and thermally hot fuel in three of the six General Electric Mark 1 boiling water reactors (BWRs) then in operation quickly began to melt. (Loss of all electricity is one of the most dangerous situations for a nuclear reactor, and is known as a station blackout.) This in turn led to a build-up of hydrogen, which is highly combustible, in the reactor buildings where any small spark could—and did—trigger explosions.
“It was an unprecedented multiple meltdown disaster,” Frontline correspondent Miles O’Brien reports. “For the first time since the Chernobyl accident in 1986, large quantities of dangerous radioactive materials—about one-tenth of the Chernobyl release—spewed into the atmosphere from a stricken nuclear power plant.”
As bad as that was, O’Brien says the problems for plant owner Tokyo Electric Power Company (Tepco) were only just beginning. That’s because Tepco had to keep the reactors cooled with enough water to prevent the absolute worst, what is popularly but misleadingly referred to as “The China Syndrome.”
According to nuclear engineer Arnie Gundersen, a China Syndrome accident is a three-stage progression. In stage one, all of the fuel inside a reactor melts and turns into a blob at the bottom of the reactor core (the “meltdown”). In stage two, the molten radioactive blob eats through the nuclear reactor vessel (“a melt-through”), which in the case of GE Mark 1 BWRs is an eight-inch steel encasement. Housing the reactor vessel is the containment structure, three feet of concrete lined with two inches of steel. If the melted nuclear fuel were to bore through that and hit the natural water table below the plant, it would result in a massive steam explosion that would send most of the reactor’s deadly contents into the air, where they would disperse far and wide.
Although CUNY physics professor Michio Kaku said on ABC’s Nightline that Tepco’s efforts were “like a squirt gun trying to put out a forest fire,” the company was able to get enough water in to keep the fuel cool enough to prevent the absolute worst case.
Gundersen says that was the good news.
The bad news is that the water that has come into direct contact with the melted fuel in the three destroyed reactors (including water that is still covering them) is leaking out the side through cracks in the containment structures, filling other buildings at the plant, and seeping down into the groundwater below and around the plant and directly into the Pacific Ocean. Frontline acknowledges the problem, pointing out that because of the high levels of radiation, it will be “a long time” before the site is decontaminated enough for anyone to be able to get inside the reactor to see exactly where the cracks are and to fix them.
As significant a problem as this ongoing contamination is, the biggest discharges of radioactivity into the Pacific—considered the largest ever release of radioactive material into the sea—occurred within the first seven weeks of the accident. At its peak concentration, cesium-137 from Fukushima reached levels 50 million times greater than those measured before the accident, according to research by Woods Hole Oceanographic Institution chemist Ken Buesseler and two Japanese colleagues.
It’s impossible to know exactly how much radioactivity contaminated the Pacific or what the full impact on the marine food chain will be. A preliminary estimate by the Japan Atomic Energy Agency reported in the Japanese daily Asahi Shimbun in October said that more than 15 quadrillion becquerels of radioactivity poured into the ocean just from the Fukushima Unit 1 reactor between March 21st and April 30th last year. (One quadrillion equals 1,000 trillion.)
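For scale, that figure can be restated in scientific notation (a simple unit conversion, not an additional number from the agency’s estimate):

\[
15\ \text{quadrillion Bq} = 15 \times 10^{15}\ \text{Bq} = 1.5 \times 10^{16}\ \text{Bq} = 15\ \text{petabecquerels (PBq)}.
\]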
A report in January in the Montreal Gazette noted that Japanese testing for radioactive cesium revealed contamination in 16 of 22 species of fish exported to Canada. Radioactive cesium was found in 73 percent of the mackerel tested, 91 percent of the halibut, 92 percent of the sardines, 93 percent of the tuna and eel, 94 percent of the cod and anchovies, and 100 percent of the carp, seaweed, shark, and monkfish. These tests were conducted in November and indicate that the radioactivity is spreading, because tuna, for example, is caught at least 900 kilometers (560 miles) offshore.
Real Health Concerns or Just Fear?
In summing up the disaster, Frontline’s O’Brien says: “The earthquake and tsunami had stripped whole towns from their foundations, killing an estimated 18,000 people. Life is forever changed here.”
But then he shifts from documenting the undeniable devastation to speculating on how big a problem remains: “[T]he big concern remains the radioactive fallout from the Fukushima nuclear explosions. People here are fearful about how much radiation there is, how far it has spread, and the possible health effects.”
Japanese citizens have decried their government’s decision to allow radiation exposures of up to 20 millisieverts a year before ordering an evacuation. O’Brien equates this level with “two or three abdominal CAT scans in the same period” but nevertheless characterizes it as “conservative.” What follows is his exchange with Dr. Gen Suzuki, a radiation specialist with the Japanese Nuclear Safety Commission.
MILES O’BRIEN: [on camera] So at 20 millisieverts over the course of a long period of time, what is the increased cancer risk?
GEN SUZUKI, Radiation specialist, Nuclear Safety Comm.: Yeah, it’s 0.2— 0.2 percent increase in lifetime.
MILES O’BRIEN: [on camera] 0.2 percent over the course of a lifetime?
GEN SUZUKI: Yeah.
MILES O’BRIEN: So your normal risk of cancer in Japan is?
GEN SUZUKI: Is 30 percent.
MILES O’BRIEN: So what is the increased cancer rate?
GEN SUZUKI: 30.2 percent, so the increment is quite small.
MILES O’BRIEN: And yet the fear is quite high.
GEN SUZUKI: Yes, that’s true.
MILES O’BRIEN: [voice-over] People are even concerned here, in Fukushima City, outside the evacuation zone, where radiation contamination is officially below any danger level.
Missing from the above exchange is both established and emerging radiation biology science, as well as the fact that radiation exposure is linked to numerous other health problems from immune system damage, heart problems and gastro-intestinal ailments to birth defects, including Down’s syndrome.
Gundersen points out that, according to the U.S. National Academy of Sciences 2006 BEIR report (BEIR stands for Biological Effects of Ionizing Radiation), an annual exposure of 20 millisieverts will cause cancer in one of every 500 people. Since this is an annual exposure rate, the risk multiplies with each year of exposure. So, for example, five years of exposure to 20 millisieverts will result in an additional cancer in one in 100 people.
Gundersen notes that the risk is not the same for all population groups. According to Table 12-D in BEIR VII Phase 2, the younger the person exposed, the greater the risk of cancer.
Girls are nearly twice as vulnerable as boys of the same age, while an infant girl is seven times and a five-year-old girl five times more likely to get radiation-induced cancer than a 30-year-old male. Using BEIR’s risk data, one in 100 girls will develop cancer for every year that they are exposed to 20 millisieverts. If they are exposed for five years, the rate increases to one in twenty.
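To make the arithmetic explicit, here is a minimal worked restatement of the figures above, assuming (as these BEIR-based estimates do) that risks at a constant annual dose add across years and scale with the age/sex multiplier:

\[
p_{\text{adult}} = \tfrac{1}{500}\ \text{per year at 20 mSv/yr}, \qquad p_{\text{adult, 5 yr}} = 5 \times \tfrac{1}{500} = \tfrac{1}{100};
\]
\[
p_{\text{girl}} \approx 5 \times \tfrac{1}{500} = \tfrac{1}{100}\ \text{per year}, \qquad p_{\text{girl, 5 yr}} \approx 5 \times \tfrac{1}{100} = \tfrac{1}{20}.
\]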
New radiobiology science shows even more cause for concern. Numerous studies of nuclear workers over the last six years—including one authored by 51 radiation scientists that looked at more than 400,000 nuclear workers in 15 countries—found higher incidences of cancer at significantly lower exposure rates than what Japan is allowing.
This finding is important because it challenges the application of the highly questionable data from the Japanese atom bomb survivors that authorities use to set radiation exposure limits.
Nuclear reactors emit low doses of radionuclides into the air as part of their normal operation. Because workers are generally exposed to repeated low doses over time, rather than the single very high dose delivered by a nuclear bomb, worker data is a much more accurate predictor of radiation-induced cancer in people in fallout zones, or downwind of nuclear reactors, than the records of Hiroshima and Nagasaki survivors.
Despite the fact that the National Academy of Sciences accepts that there is no safe dose of radiation, nuclear proponents have long insisted that low doses provided very little, if any, risk from cancer. (Some even say it’s beneficial.)
But new evidence shows otherwise. Chromosomal translocations (or aberrations), a kind of genetic injury that occurs when DNA molecules damaged by genotoxic chemicals or radiation don’t properly repair themselves, are well documented in cases of medium to high radiation exposure. Chromosomal translocations are also known to increase the risk of many forms of cancer.
Until recently, it wasn’t clear whether low-dose exposures caused chromosomal translocations. A 2010 study looking at the impact of medical X-rays on chromosomes not only found that this chromosomal damage occurs with low-dose radiation exposure, but that there were more chromosomal translocations per unit of radiation below 20 millisieverts (the Japanese limit) and—surprisingly—“orders of magnitude” more of this kind of damage at exposures below 10 millisieverts.
Frontline’s complacent assessment of the “small increment” of increased cancer risk to Japanese citizens from the ongoing Fukushima fallout contrasts sharply with an assessment by the Canadian Medical Association Journal. That peer-reviewed journal quotes health experts who say that the levels of radiation the Japanese government has set before requiring evacuation, combined with a “culture of cover-up” and insufficient cleanup, are exposing Japanese citizens to “unconscionable” levels of radiation.
CMAJ notes that instead of expanding the evacuation zone around the plant to 50 miles, as international authorities have urged, the Japanese government has chosen to “define the problem out of existence” by raising the allowable level of exposure to one that is twenty times higher than the international standard of one millisievert per year.
This “arbitrary increase” in the maximum permissible dose of radiation is an “unconscionable” failure of government, contends [chair of the Medical Association for Prevention of Nuclear War, Tilman] Ruff. “Subject a class of 30 children to 20 millisieverts of radiation for five years and you’re talking an increased risk of cancer to the order of about 1 in 30, which is completely unacceptable. I’m not aware of any other government in recent decades that’s been willing to accept such a high level of radiation-related risk for its population.”
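Ruff’s figure is consistent with the same additive arithmetic used above. Taking the BEIR-derived risk of 1 in 500 per year at 20 millisieverts, and assuming an average childhood sensitivity of roughly three times the adult value (a rough midpoint across ages and sexes, not a figure given in the article):

\[
5\ \text{yr} \times \tfrac{1}{500}\ \text{yr}^{-1} \times 3 = \tfrac{3}{100} \approx \tfrac{1}{33},
\]

or about one additional cancer in a class of 30 children.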
Frontline’s take epitomizes a longstanding pattern of denying radiation health effects, even in the most dire nuclear disasters (and Fukushima is arguably the most dire to date), and of blaming illness on the victims’ personal habits or on their stress from fear of radiation. This was done to the victims of the March 1979 accident at Three Mile Island in central Pennsylvania and to Chernobyl victims, and it is happening again with Fukushima.
Nuclear TINA
But what about alternatives? Are there any, or does Margaret Thatcher’s famous slogan regarding capitalist globalization, “There Is No Alternative” (TINA) apply?
Frontline answers this question by going to Germany, where correspondent O’Brien probes the German psyche in an attempt to learn why nuclear power elicits such a strong negative reaction there.
He questions several German citizens, including an adorable little boy, on why they are so afraid of nuclear power. He speaks with the head of the German government committee tasked with considering how to phase out nuclear power, as well as a German energy economist, who says the decision is not likely to change.
And he expresses astonishment that an industrial nation the scale of Germany has decided to shut down all seventeen of its reactors, which account for 23 percent of its electricity generation, within a decade.
Standing in a field that he identifies as the world’s largest solar farm with solar panels as far as the eye can see, O’Brien says Germans support this “seemingly rash decision” because they have faith that there is an alternative.
He then informs viewers that over the past 20 years, Germany has “invested heavily in renewables, with tax subsidies for wind turbines and solar energy,” adding, “It’s kind of surprising to see [the world’s largest solar farm] in a place like this with such precious little sunshine.”
Though he says there is plenty of wind, he characterizes Germany’s target of producing 80 percent of its energy from renewable sources by 2050 as a “bold bet” whose success will depend on technological breakthroughs to store enough wind or other renewable energy (presumably through improved battery technology) so that it can provide a steady source of power. He notes that the steady production of power is something “nuclear energy does very well.”
Atomiconomics
Any honest discussion of nuclear power—especially when raising the issue of tax subsidies and other government support for renewable sources like wind and solar—must include information on the many hundreds of billions of dollars of public support thrown its way. Despite the highly publicized recent bankruptcy of Solyndra, this support dwarfs what has been given to renewables.
In the executive summary to his February 2011 report on nuclear subsidies, energy economist Doug Koplow says the “long and expensive history of taxpayer subsidies and excessive charges to utility ratepayers…not only enabled the nation’s existing reactors to be built in the first place, [they] have also supported their operation for decades.”
Every part of the nuclear fuel chain—mining, milling and enriching the uranium fuel; costs associated with the construction, running, and shutting down and cleaning up of reactors; the waste; and even the lion’s share of the liability in the case of an accident—has been subsidized to one degree or another.
Koplow says that because the value of these subsidies often exceeded the value of the power produced, “buying power on the open market and giving it away for free would have been less costly than subsidizing the construction and operation of nuclear power plants.”
One of the most important gifts to the nuclear industry is a pass on financial responsibility for a serious accident, legislated during the Cold War in the Price-Anderson Act of 1957. Without this protection, it’s highly unlikely the commercial nuclear power industry could or would exist.
In a recent article in the Bulletin of the Atomic Scientists arguing for the end of Price-Anderson, nuclear industry economic analyst Mark Cooper points out that 50 years ago General Electric and Westinghouse, the two largest reactor manufacturers, said they wouldn’t build reactors without it.
Although Price-Anderson was initially rationalized (along with many of the other subsidies) as necessary protection to help get the fledgling industry going, Congress has repeatedly renewed it over the years.
Today, reactor owners have to carry a small amount of private insurance, and Price-Anderson creates an industry-wide pool currently valued at around $12 billion. Accounting for inflation, Cooper puts the estimated costs of Chernobyl in excess of $600 billion. In Japan, the Fukushima accident is projected to cost up to $250 billion (though it could well be more). Here in the U.S., Cooper says, a serious accident at, say, Indian Point, just 35 miles north of Manhattan, could cost as much as $1.5 trillion.
If such an accident were to happen in the U.S., taxpayers would be left with the tab for the difference.
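Using the article’s own figures, the size of that tab is easy to state; for the Indian Point scenario, for example:

\[
\underbrace{\$1{,}500\ \text{billion}}_{\text{estimated accident cost}} - \underbrace{\$12\ \text{billion}}_{\text{Price-Anderson pool}} \approx \$1{,}488\ \text{billion},
\]

meaning the industry pool would cover less than one percent of the cost, with the balance falling to the public.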
But even with all of the subsidies, the cost of building a new reactor—pegged at between $6 billion and $12 billion apiece—is still so expensive that reactors only get built with substantial government help.
To jumpstart a new round of nuclear construction, the Obama administration is trying to offer $54.5 billion in loan guarantees (only $18.5 billion is actually authorized by Congress). This means that if a project is delayed or cancelled for some reason—including for concerns over safety—taxpayers pick up the tab for that delay or cancellation.
Although the U.S. Department of Energy is expected to approve $8.3 billion in loan guarantees for the two new reactors at the Vogtle plant in Georgia any day now, significant concerns remain over the lack of transparency regarding the federal loan guarantees.
Besides the massive federal subsidies, the nuclear industry has also succeeded in getting three states so far, South Carolina, Georgia, and Florida, to pass legislation mandating “advanced cost recovery.” This allows nuclear utilities to collect the cost of building a reactor from their customers before it is built.
Advanced cost recovery programs have existed in the past, but Morgan Pinnell, Safe Energy Program coordinator at Physicians for Social Responsibility, says the new ones the nuclear industry is pushing are particularly irresponsible from a public-interest point of view.
For example, in December 2011, a resolution was offered to the St. Petersburg City Council calling for the repeal of the 2006 state legislation, F.S. 366.93, citing, among other things, that the two reactors Progress Energy proposed for Levy County would raise its customers’ bills by more than $60 a month. Even if the reactors are never built, it’s not clear whether the utility would have to pay the money back.
Are Nukes Green?
Back in the 1980s, when nuclear power was widely considered a pariah, growing concern about global warming in government circles provided an opportunity for the beleaguered industry. Since it was recognized that nuclear power plants, unlike coal plants, did not produce carbon emissions when generating electricity, the UN International Atomic Energy Agency and some policymakers began to promote nuclear energy as a necessary power source in a warming world.
By the early nineties, the nuclear industry began casting itself as the clean, green “fresh air” energy source, a description that goes unchallenged in today’s mainstream media. Toeing this line, Frontline’s Nuclear Aftershocks argues that nuclear power is needed to combat climate change.
It bears asking how true, or even realistic, this claim is. In order to avoid the most catastrophic effects of global warming, many climate scientists have been saying for at least the better part of a decade that by 2050 humanity needs to reduce global carbon emissions 80 percent from what was emitted in 2000.
An MIT task force report, The Future of Nuclear Power, written ostensibly to figure out how to do that, calls for 1,000 to 1,500 new reactors of 1,000 megawatts electric (MWe) each to be up and running by 2050, enough to increase the share of nuclear-generated electricity from 20 percent to 30 percent in the U.S. and from 17 percent to 20 percent globally. (Currently there are 435 reactors operating in the world and 104 in the U.S., at 60 different locations.)
The first page of the executive summary of the report says that such a deployment would “avoid 1.8 billion tonnes of carbon emissions from coal plants, about 25 percent of the increment in a business-as-usual scenario.”
But displacement of 25 percent of the expected growth in carbon emissions does not square with the need to cut emissions by 80 percent by 2050. That aside, the 2009 update of the report notes that progress on building new reactors has been slow, both globally and in the U.S.
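A quick sanity check with the report’s own numbers makes the mismatch concrete. This sketch takes the 1.8 billion tonnes and the 25 percent share at face value, exactly as quoted above:

```python
# Sanity check on the MIT report's numbers, taken at face value.
avoided = 1.8e9            # tonnes of carbon avoided by the reactor build-out
share_of_increment = 0.25  # the report calls that ~25% of BAU emissions growth

increment = avoided / share_of_increment  # implied BAU growth: 7.2 Gt
remaining_growth = increment - avoided    # growth left even with the build-out

print(f"Implied business-as-usual growth: {increment / 1e9:.1f} Gt")
print(f"Growth remaining despite the build-out: {remaining_growth / 1e9:.1f} Gt")
```

In other words, even the report’s best case leaves coal-plant emissions growing by some 5.4 billion tonnes, while the climate math demands that total emissions fall steeply.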
The 2003 report reveals another hitch in this plan: in order to deal with the nuclear waste from that many new reactors, an underground repository the size of the highly controversial and cancelled Yucca Mountain would have to be built somewhere in the world every four years. It bears noting that we are in the sixth decade since commercial nuclear power generation began and not one permanent repository has been completed anywhere in the world.
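The repository arithmetic is straightforward to reconstruct. Here is a rough sketch that relies on two assumed figures not given in the report as quoted here: Yucca Mountain’s statutory capacity of about 70,000 tonnes of spent fuel, and a typical discharge of roughly 20 tonnes per year from a 1,000 MWe reactor.

```python
# Rough reconstruction of the "one repository every few years" claim.
# Assumed figures (not from the article): Yucca Mountain's statutory
# capacity is ~70,000 tonnes of spent fuel; a 1,000 MWe reactor
# discharges roughly 20 tonnes per year.
yucca_capacity_t = 70_000
spent_fuel_per_reactor_t = 20

for fleet in (1_000, 1_500):
    annual_waste_t = fleet * spent_fuel_per_reactor_t
    years_per_repository = yucca_capacity_t / annual_waste_t
    print(f"{fleet:,} reactors -> one Yucca-sized repository "
          f"every {years_per_repository:.1f} years")
```

At 1,000 reactors that works out to a new repository about every three and a half years; at 1,500, one every two and a third, in the same ballpark as the report’s “every four years.”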
Some people are calling for fuel reprocessing, which takes spent nuclear fuel and uses a chemical process to extract plutonium and uranium to make more nuclear fuel. Aside from the fact that reprocessing wouldn’t actually reduce the volume of spent nuclear fuel very much, it’s dangerous, expensive, and irresponsibly polluting (the West Valley reprocessing plant in Western New York, which ran for six years between 1966 and 1972, is still a huge toxic mess).
Reprocessing also creates lots of weapons-grade plutonium that can be made into atomic bombs, a feature that one might question in our increasingly tense and politically unstable world.
Other nuclear enthusiasts see a magic bullet in thorium reactors, but according to a 2009 Department of Energy study, “the choice between uranium-based fuel and thorium-based fuels is seen basically as one of preference, with no fundamental difference in addressing the nuclear power issues.”
One specific design, the liquid fluoride thorium reactor, or LFTR (pronounced “lifter”), has attained cult status as a “new, green nuke” that its promoters say will produce a virtually endless supply of electricity that is “too cheap to meter” in “meltdown-proof” reactors, creating minuscule quantities of much shorter-lived waste that is impossible to refashion into nuclear bombs.
But critics say these claims are fiction. Thorium technology is significantly more expensive than the already exorbitant uranium-fueled reactors, so there are serious doubts it could ever be commercially viable without much higher subsidies than the nuclear industry already receives.
There are also serious safety concerns with reactors that run on a liquid fuel of hot, molten salt, as the LFTR design does.
Ed Lyman, senior scientist in the Global Security program at the Union of Concerned Scientists, says a small prototype of the LFTR that operated at the Oak Ridge National Laboratory in the 1960s remains “one of the most technically challenging cleanup problems that Oak Ridge faces.”
Nukes in a Warming World
The need for nuclear power has been sold to the public as a way to prevent the existential threat of catastrophic climate change. But that argument can be turned the other way. In a world of increasingly extreme weather events, we need to question the wisdom of having more potential sources of widespread, deadly radiological contamination that could be overwhelmed by some Fukushima-style natural disaster.
In a presentation to the city council of San Clemente, home of the troubled San Onofre nuclear power plant, which sits right on the Pacific Ocean halfway between Los Angeles and San Diego, nuclear engineer Arnie Gundersen points out that U.S. nuclear plants are designed to withstand whatever their designers expect Mother Nature to throw at them. This requirement, their “design basis,” is spelled out in the Nuclear Regulatory Commission’s 10 CFR Part 50, Appendix A, Criterion 2.
Different locations have different risks, so the requirements for plants vary. For example, nuclear plants in California are designed to be able to withstand stronger earthquakes than, say, the reactor in Vermont. Likewise, plants built in Florida are designed to handle more severe hurricanes than plants in upstate New York.
The requirements are set for a one-in-a-thousand-year event. Considering that four events exceeded the design basis of nuclear reactors in the past year—the 9.0 Tōhoku earthquake in Japan, the tsunami that followed, the flooding of the Missouri River around the Ft. Calhoun nuclear plant in Nebraska, and the 5.8 earthquake centered near the North Anna plant in Virginia (two of which resulted in disaster)—how confident can we be that either nuclear operators or the NRC have anticipated the worst nature can throw at us?
Using the thousand-year scenario, Gundersen points out that for any one reactor running for 60 years, there’s about a 6 percent chance that it will see an event as bad as or worse than what it was designed for. Multiplying that 6 percent across the 60 U.S. nuclear plant locations yields an expected 3.6 such events, which Gundersen renders as a “360 percent chance.”
“In other words,” Gundersen says, “it’s a near certainty that some plant in the U.S. over its lifetime will experience an event worse than designers had anticipated. As a matter of fact, it’s more like three or four plants…”
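The arithmetic behind those odds is easy to reproduce. A minimal sketch, assuming each of the 60 sites independently faces a one-in-1,000-year design-basis event over a 60-year operating life (the round numbers are mine):

```python
# Minimal sketch of Gundersen's odds: independent one-in-1,000-year
# design-basis events at each of 60 U.S. plant sites over 60 years.
p_annual = 1 / 1000  # yearly chance of a beyond-design-basis event
years = 60           # assumed operating life of a reactor
sites = 60           # U.S. nuclear plant locations

p_one_site = 1 - (1 - p_annual) ** years        # ~5.8%, the "6 percent"
expected_events = sites * p_one_site            # ~3.5, the "360 percent"
p_at_least_one = 1 - (1 - p_one_site) ** sites  # ~97%

print(f"Per site over {years} years: {p_one_site:.1%}")
print(f"Expected events across {sites} sites: {expected_events:.1f}")
print(f"Chance of at least one such event: {p_at_least_one:.0%}")
```

Computed properly, the chance of at least one such event caps out just below 100 percent, but the expected number of beyond-design-basis events across the fleet is indeed three to four, which is exactly Gundersen’s point.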
As the impacts from global warming worsen, the risks will undoubtedly increase.
Consider that 2011 broke all records for billion-dollar weather disasters in the U.S. AP science writer Seth Borenstein recently described it this way: “With an almost biblical onslaught of twisters, floods, snow, drought, heat and wildfire, the U.S. in 2011 has seen more weather catastrophes that caused at least $1 billion in damage than it did in all of the 1980s, even after the dollar figures from back then are adjusted for inflation.”
But it wasn’t just the U.S.: 2011 also saw record-breaking extremes all over the world throughout the year. Ross Gelbspan, whose 1997 book The Heat is On chronicled the fossil fuel lobby’s remarkably successful campaign to deceive the public and derail any action to address global climate destabilization, catalogues a hefty list of meteorological calamities, from floods, torrential rains, massive mudslides, colossal snowstorms, ripping windstorms, and tornadoes to withering heatwaves, droughts, and wildfires.
With or without nuclear power, the escalation of global warming isn’t likely to slow any time soon. Though a recent discovery of the effectiveness of polyethylenimine at capturing CO2 sounds promising (researchers say it can capture carbon at large industrial sources and at small individual ones like car exhaust, and can even pull it directly from the air), it remains to be seen how quickly scrubbers made from this material can be manufactured and deployed and how well they will actually work.
In any case, fossil fuel companies are doubling down on their pursuit of “unconventional” fossil fuels like natural gas from shale, coalbed methane, and tight gas sands (fracking), and oil from deepwater wells and tar sands—all in all, the dirtiest (in terms of greenhouse gas and other pollution), riskiest, and most energy-intensive sources.
And in the absence of policies to reduce greenhouse gases, the U.S. Energy Information Administration’s International Energy Outlook 2011 projects global coal use to rise 50 percent between 2008 and 2035 from 139 quadrillion Btu to 209 quadrillion Btu.
Despite the increasing urgency to tackle global warming, the most recent global climate talks in Durban failed to reach agreement on extending the Kyoto Protocol, which laid out the world’s only legally binding (but subsequently ignored) carbon emissions reductions.
It’s time to reexamine a lot of the assumptions that lurk beneath the nuclear-power-is-necessary-to-deal-with-climate-change narrative. Neither Frontline’s Nuclear Aftershocks nor any other mainstream coverage I have seen mentions the big elephant in the room: the voracious, energy-gobbling economy that creates the need for enormous, centralized power sources and is making the planet (and us) sick.
When junk-food addicted smokers get diabetes, cancer, heart disease, or any number of other maladies considered “lifestyle diseases,” the admonishment that they need to change their lifestyle is typically accepted without question.
We would do well to start applying that same logic to the way our societies use energy and the kind of economy such energy use powers, rather than blindly accept the false choice of either turning the Earth into Venus through global warming or poisoning large swaths of it with radioactivity.