30 Years after Chernobyl, Public Split on Nuclear Power

Today marks the 30th anniversary of the most catastrophic nuclear disaster in history. On April 26, 1986, one of the four reactors at Ukraine’s Chernobyl nuclear plant exploded, spewing radiation into the atmosphere.

The disaster occurred following an “experiment” to determine whether the cooling pump system could function on low reactor power (in the event of an electricity failure). For the experiment, staff lowered additional control rods into the reactor core to reduce output to 20%. But they lowered too many, and output dropped rapidly—almost to complete shutdown. To counteract the drop, the staff withdrew more and more rods until, unexpectedly, power surged to more than 10 times the normal level. Two explosions followed, rupturing the reactor vessel and igniting a fire that burned for 9 days.

The explosions released over 5% of the reactor core into the atmosphere, contaminating large swaths of Ukraine, Belarus, Russia, and Scandinavia. Within 3 months, at least 31 people had succumbed to acute radiation sickness, and over 350,000 fled their homes. More than 4,000 people contracted thyroid cancer in the ensuing years. (Experts believe that long-term radiation exposure will cause a further 9,000 to 93,000 cancer deaths.)

These horrific effects led to a burst of activity to improve atomic safety. Among other things, the nuclear community created the World Association of Nuclear Operators to review 430 reactors worldwide for problems. The International Atomic Energy Agency “beefed up” its role as UN nuclear watchdog and expanded its safety standards. Moreover, countries signed various international agreements, including the Convention on Nuclear Safety.

These efforts were “critical”, yet they have failed to prevent all further nuclear accidents. In March 2011, for example, the second-largest nuclear disaster occurred at Japan’s Fukushima-Daiichi power plant. In this case, a massive earthquake off the coast triggered a 13–15 metre tsunami. The waves overtopped the seawall, flooding key buildings and destroying the equipment needed to prevent a nuclear meltdown.

Such incidents have prompted more and more countries to phase out the use of nuclear energy. These countries include Italy, Belgium, Switzerland, and recently, Germany. Chancellor Angela Merkel explained “After what was… an unimaginable disaster in Fukushima, we have had to reconsider the role of nuclear energy”. The country now plans to close all of its 17 nuclear reactors by 2022.

But despite its risks, nuclear energy has many redeeming qualities. It is energy dense, producing far more energy per unit of mass than any other source. It is cost-competitive thanks to low fuel costs. It provides more reliable base-load energy than solar and wind power, which are weather dependent. And unlike fossil fuels, nuclear power generation produces minimal greenhouse gases like carbon dioxide and methane. In the words of Christine Todd Whitman, a former EPA Administrator, nuclear energy is “the country’s largest source of clean-air energy that’s available 24/7” and “a critical tool in combating climate change”. For these reasons, nuclear energy remains a key part of our energy mix, accounting for 20% of US electricity generation.

Where do you stand on nuclear power? To learn more, contact the New England Coalition on Nuclear Pollution, the Natural Resources Defense Council, Greenpeace, and Clean Air Cool Planet.

Starbucks Grinds Away at Hunger

Starbucks is synonymous with excess – after all, who really needs a $6.00, 440-calorie Salted Caramel Mocha Frappuccino? But now, the high-end coffee company is using its excess for a good cause. It is donating 100% of its leftover prepared meals to food banks and shelters through the FoodShare program.

Starbucks’ involvement in the program dates back to 2010. At that time, the company partnered with Food Donation Connection to donate unsold pastries. All other food, however, was simply wasted at the end of every day. Thus, baristas urged management to expand the program to include perishables, noting “it’s frustrating to throw away so much food—especially because you know that there are people that need it”.

According to Starbucks Brand Manager Jane Maly, the challenge was preserving the food’s quality during delivery “so when it reached a person in need, they could safely enjoy it”. The solution arrived in the form of a fleet of refrigerated trucks. The trucks can visit the chain’s 7,600 U.S. locations throughout the day to collect any unsold, edible items (including breakfast sandwiches, salads, paninis, Bistro Boxes, etc.). They can then deliver the items to the Feeding America network, the largest hunger-relief and food-rescue charity in the United States.

Starbucks first piloted the program in Arizona last July. Managers figured that if they could keep food cold and fresh in the heat of the Sonoran Desert, they could do it anywhere! After a successful pilot, the company introduced the program nationwide (at participating Starbucks) in March 2016.

The coffee house predicts that it will donate almost 5 million meals by the end of its first year…and more than 50 million meals by 2021! This could take a sizable bite out of America’s hunger problem, which currently affects over 48 million people. As an added benefit, Starbucks will divert food waste from landfills, drastically reducing its environmental footprint. (Feeding America estimates that Americans produce 70 billion pounds of food waste every year!)

Starbucks also believes that its efforts could inspire other companies to do the same. “Our hope is by taking this first step, other companies will see the possibility for their participation and together we will make great strides in combating hunger”. These companies—ranging from grocery stores to restaurants—could even use the same fleet of refrigerated trucks. Then Starbucks’ impact could really go from “grande” to “venti”!

Want to join the fight against hunger? Complete an Op4G survey for one of our partners: Gleaners Food Bank of Indiana, Capital Area Food Bank of Texas, Oregon Food Bank, SF-Marin Food Bank, Vermont Foodbank, and the Second Harvest Food Bank of Santa Clara and San Mateo Counties.


Flint, Michigan: A City in Troubled Water

The residents of Flint, Michigan are no strangers to hardship. In the past three decades, auto plants have closed, the population has plunged, violent crime has spiked, and poverty rates have reached 40%. But in recent months, the blue-collar city north of Detroit has hit a new low. Lead has contaminated the city’s water supply, leading to widespread poisoning and a state of emergency.

The crisis originated in 2011. After years of economic decline, the city was “so broke that it was taken into state receivership”. Michigan Governor Rick Snyder promptly ousted the mayor and city council and appointed a series of emergency managers to govern and reduce costs. In 2013, one manager decided to switch Flint’s water source from the Detroit Water and Sewerage Department to the new Karegnondi Water Authority (both draw from Lake Huron). But as the connecting pipes were not yet built, he ordered officials to temporarily pump from the Flint River – at a projected savings of $1 million/year. Flint River water began chugging through city pipes in April 2014.

The Flint River is a “cesspool” tainted by farm runoff, sewage, and decades of industrial effluent. To make matters worse, the river’s water is highly corrosive to lead (19 times more so than Detroit water!). Yet the state refused to add a required anticorrosive agent, at a cost of just $9,000. As a result, the water corroded lead throughout the system, including the lead service lines connecting homes to city water mains and the lead solder used to fuse copper pipes. This lead then leached into Flint’s water supply.

Almost immediately, residents complained about the cloudiness, colour, taste and smell of the city’s water. However, the effects were more than cosmetic. Many reported that the water was “making them sick”, causing rashes, hair loss, headaches, eye irritations, and other health problems. After a flood of warning signs, the city council called on the emergency manager to switch Flint back to Lake Huron water in January 2015. But the calls went ignored. [Outrageously, only GM received a “special hook-up to clean water” after car parts showed corrosion].

The EPA started to grasp the grave danger in summer 2015. In a June memo, an employee wrote “Recent drinking water sample results indicate the presence of high lead results in the drinking water”. Moreover, “The lack of any mitigating treatment for lead is of serious concern”. Still, a state of Michigan spokesperson advised Flint residents to “relax”, claiming that test results just didn’t hold water.

By late summer, however, the poisoning of the water supply became undeniable. Researchers from Virginia Tech found that some Flint water samples contained 13,200 ppb of lead – far above the EPA limit (15 ppb) and even the threshold for hazardous waste (5,000 ppb)! Likewise, Dr. Mona Hanna-Attisha compared children’s blood-lead levels with earlier samples, demonstrating a marked increase.

But it wasn’t until October 2015, after months of denial and deception, that Michigan officials acknowledged the environmental nightmare. Governor Snyder switched Flint’s water back to Detroit’s system at a cost of $12 million.

Already, the damage was done. Besides the visible impacts, the lead has likely caused significant neurological damage, particularly in Flint’s 8,657 children under 6. According to the World Health Organization, lead exposure affects brain development resulting in “irreversible” effects like reduced IQ, behavioural changes (such as shortened attention span and increased antisocial behaviour), and reduced educational attainment! It can also cause anaemia, hypertension, renal impairment, and toxicity to the immune and reproductive systems.

The economic impact is also significant. Many victims now require costly healthcare and social services, including special education. The city’s $2.4 billion in home value has gone “down the economic drain”, leaving numerous residents with “a net worth of zero”. Moreover, many companies are struggling to stay afloat in Flint, while others are avoiding the city altogether.

Like Flint water, the solution to the disaster remains unclear. Under the current state of emergency, officials are using $5 million in federal funding to provide water, filters, water test kits, and other necessary items. Non-profit groups, like the American Red Cross and Food Bank of Eastern Michigan, are joining in. Furthermore, the Michigan National Guard and police are going door-to-door delivering water…and warnings.

Once basic needs are met (in the world’s richest country!), attention must turn to fixing Flint’s water system. Simply switching back to Lake Huron water was not enough, as the pipes are now severely corroded. Some suggest recoating the pipes with an anticorrosive agent, but this could take over six months. Flint’s new mayor, Karen Weaver, argues that “we have been emotionally traumatized and need new pipes” (costing up to $55 million). Others call for a compromise: gradually replacing every lead service line while coating the pipes with phosphates.

Over the long term, the government will need to fund various health and social services for victims. In the words of Dr. Hanna-Attisha, we need to throw “all of these wraparound services at children” or risk “lifelong, multigenerational consequences”. Governor Snyder is currently seeking $195 million for this purpose. Finally, come hell or high water, we must identify and prosecute the perpetrators of this fatal fiasco. Already, hearings are underway on Capitol Hill. In addition, the Department of Justice, FBI, and Environmental Protection Agency are investigating possible crimes, including misconduct in office and involuntary manslaughter.

Do your part! Please complete an Op4G survey for the American Red Cross, or donate to: the Food Bank of Eastern Michigan, the Community Foundation of Greater Flint, or the United Way of Genesee County.

 

Tiny Costa Rica Sets Big Record for Renewable Energy

Last month, I travelled to the small Central American country of Costa Rica. Among other things, I marvelled at the lush forests, pristine beaches, and exotic wildlife. But perhaps the most admirable feature of Costa Rica is its small carbon footprint. According to sources, the country used only renewable energy to generate electricity for the first 75 days of 2015—a new world record! It also plans to produce 97% of its electricity from renewables this year.

Costa Rica’s strong “eco-friendly” spirit has motivated the transition to renewable energy. Polls suggest that approximately 80% of Costa Ricans believe in climate change, 87% support wind power plants, and 77% support geothermal plants. Meanwhile, “less than a quarter support the further use of oil”. Recent economic realities have provided added impetus. High oil prices from 2011-2014, and the bankruptcy of nearby oil exporter Venezuela, confirmed the need to move away from fossil fuels.

Fortunately, several factors have facilitated Costa Rica’s transition to clean energy. First, Costa Rica’s high tropical rainfall and mountainous interior are “well suited for hydropower”, which supplies 80% of the country’s electricity. This spring, “unusually heavy rainfall bolstered water reservoirs enough to produce ample power for the whole country”. Costa Rica also boasts a string of active volcanoes. The resulting geothermal energy accounts for 15% of the country’s electricity mix.

Of further note, Costa Rica has a “low heavy industrial base”. Instead, the country relies largely on “agriculture and tourism—particularly eco-tourism”. Costa Rica also has a small population of 4.8 million people, approximately equivalent to that of Alabama.

But despite these factors, the transition is not yet complete. The Rich Coast still relies heavily on fossil fuels for transportation. In fact, “vehicles consume a whopping 70% of all petroleum consumed in Costa Rica and account for 40% of the country’s total carbon emissions”. Experts blame aging vehicles, high-polluting engines, and a failed electric bus program.

Moreover, since most renewable sources are weather dependent, Costa Rica’s clean energy supply sometimes falls short. In these instances, utilities resort to burning dirty bunker fuel. (Regrettably, climate specialists expect changing rain patterns will make hydropower even less reliable).

Still, Costa Rica’s progress is an inspiration to other states. So too are the achievements of Denmark (40% wind energy), Iceland (85% geothermal energy and hydropower), and Sweden, Bulgaria, and Estonia, which have met their energy goals for 2020. Highlighting these success stories is critical in the lead-up to the UN’s 2015 climate change conference, which aims “to achieve a legally binding and universal agreement on climate”. In the words of Monica Araya, Executive Director of climate change think-tank Nivela, “It’s easier in the U.S. and elsewhere to move if [they] see others moving.”

To learn more about renewable energy, please contact ACORE, the Foundation for Renewable Energy and the Environment, or our partners, Solar Sonoma County and Clean Air Cool Planet.

Adding Fuel to the Fire: Will Plunging Oil Prices Hurt the Environment?

Oil prices are in free fall. On Monday, the price of a barrel of West Texas Intermediate (WTI) crude fell below $50 for the first time since 2009. This compares to a price of $105 per barrel just 6 months prior. Certainly, a drop of this magnitude could have considerable economic and geopolitical implications. But how will it impact the environment?

On one hand, analysts argue that the slide in oil prices could generate environmental benefits and opportunities. Foremost, the lower price of oil reduces the incentive for energy companies to drill for more. According to one shale pioneer, companies will “pull back and won’t drill until the price recovers”. This is particularly true for high-cost oil fields with break-even points above $50 per barrel. Many such fields contain heavier oils, which require significantly more energy to extract and refine and create greenhouse gas footprints “nearly twice as large as lighter oils”. Thus, in the words of the Natural Resources Defense Council, “Low prices keep the dirty stuff in the ground”.

Secondly, the drop in oil prices and production will hurt the energy sector’s bottom line. While big oil can likely weather the storm, many junior oil companies will struggle to secure enough profit and financing to remain in business. In fact, some believe that OPEC is intentionally suppressing oil prices to “clean up the marginal market”. Already, the number of junior oil companies in Canada’s oil sands has fallen from 94 in 2007 to 43 (as of Q3 2014). This decline in oil companies could further reduce oil extraction.

Additionally, low oil prices will make it more “politically feasible to implement the carbon pricing reforms…necessary for significantly reducing emissions”. Consumers will “more readily accept” a carbon tax, for example, than when oil prices are already high. Hence, according to Virgin Airlines owner Richard Branson, “If governments want a carbon tax…[2015] would be the best time”. Former Treasury Secretary Lawrence Summers adds that “It would be a hugely important symbolic step ahead of the global climate summit in Paris late this year”.

But lower oil prices could also produce significant adverse environmental effects. The clearest effect is on oil consumption, which produces the greenhouse gases linked to climate change. Basic economics dictates that as the price of oil declines, there will be “more oil use now”. The same is true for oil products (e.g. gasoline) or complements (e.g. cars and flights).

Consider gasoline, for example. Cheaper oil translates into cheaper gasoline, as “crude oil accounts for about half of the price of gasoline at the pump”. In the short term, this may lead to longer or more frequent vehicle use. Over the long term, it may encourage consumers to purchase less fuel-efficient vehicles or homes farther from the city. Consequently, economists estimate that a 25% drop in gasoline prices could increase gasoline consumption by 2–5% immediately and 10–20% over the long term.
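
For the quantitatively inclined, those figures imply a price elasticity of gasoline demand (the percentage change in consumption divided by the percentage change in price) of roughly -0.08 to -0.2 in the short run and -0.4 to -0.8 over the long run. The short Python sketch below simply restates that arithmetic; the function name and layout are illustrative, not part of the cited estimates.

```python
# Illustrative sketch (not from the article): derive the implied price
# elasticity of gasoline demand from the figures quoted above.
# Elasticity = (% change in quantity demanded) / (% change in price).

def implied_elasticity(pct_price_change: float, pct_quantity_change: float) -> float:
    """Simple arc approximation of the price elasticity of demand."""
    return pct_quantity_change / pct_price_change

PRICE_DROP = -25.0  # the 25% fall in gasoline prices cited above

estimates = {
    "short run": (2.0, 5.0),    # immediate rise in gasoline consumption (%)
    "long run": (10.0, 20.0),   # long-term rise in gasoline consumption (%)
}

for horizon, (low, high) in estimates.items():
    e_low = implied_elasticity(PRICE_DROP, low)
    e_high = implied_elasticity(PRICE_DROP, high)
    print(f"{horizon}: implied elasticity between {e_low:.2f} and {e_high:.2f}")

# Prints:
# short run: implied elasticity between -0.08 and -0.20
# long run: implied elasticity between -0.40 and -0.80
```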

With increased consumption of oil and oil products, the demand for alternative energy will also fall. This includes wind, geothermal, and solar energy (which was cheaper than oil before the drop). Likewise, “a sustained period of low oil prices will dampen investment in alternative technologies”, which appear “less urgent”. Governments could attempt to counter such effects through subsidies to producers or consumers. However, subsidies would need to increase as the price gap between oil and alternative power grows. For these reasons, some maintain that collapsing oil prices will “derail the green energy revolution”.

In sum, the net environmental impact of plummeting oil prices is “not immediately clear”. What is clear is that “Whether oil’s price tag is high or low, neither ensures climate protection”. So what, then, is the best long-term price of oil from an environmental perspective? According to experts at Harvard University, the ideal range could be between $60 and $80 per barrel. At $75 a barrel, for example, the price is “high enough to keep investments flowing into alternatives, while giving energy companies less reason to pursue expensive and risky oil fields that also pose the greatest threat to the environment”.

To support or learn more about alternative energy, please contact Clean Energy Now, GRID Alternatives, or our partners, Solar Sonoma County and Clean Air Cool Planet.

Flickr photo credit: taylorandayumi

An International Victory! Ozone Layer Starting to Rebound

It is a (rare) great day in environmental news! According to a United Nations report, the ozone layer is “showing its first sign of recovery after years of dangerous depletion”. In fact, scientists estimate that ozone levels climbed 4% in key mid-northern latitudes from 2000 to 2013, representing a “significant and sustained increase”.

The ozone layer, or ozone shield, is a layer in the earth’s stratosphere that absorbs most of the sun’s ultraviolet (UV) radiation. In excess, this radiation can “penetrate organisms’ protective layers, like skin, damaging DNA molecules in plants and animals”. Consequently, scientists have linked ozone depletion to skin cancer, cataracts, crop damage, and the destabilization of the aquatic food chain. In fact, the UN projects that restoring the ozone layer will prevent 2 million cases of skin cancer annually by 2030.

Scientists are calling the improvement a “victory for diplomacy and for science”. After all, the 1987 Montreal Protocol – ratified by all 197 UN members – “banned or phased out ozone depleting chemicals, including chlorofluorocarbons (CFCs)” used in refrigerators, air conditioners and aerosol cans. In the words of Achim Steiner, Executive Director of the U.N. Environment Program, banning such chemicals is “one of the great success stories of international collective action in addressing a global environmental change phenomenon.”

So what made this collective action possible (when it eludes us on climate change)? Political scientists point to numerous factors. First, unlike slowly rising temperatures, ozone depletion was visible. Satellite images in the 1980s showed “shocking” ozone holes over the Polar Regions. Similarly, there were clear causal links between ozone depletion and its human impacts. People better understood how the ozone holes (rather than climate change) could impact them directly.

Additionally, several major countries were quick to join the fight against ozone depletion. The US, for example, ratified the Montreal Protocol in early 1988 (yet never ratified the Kyoto Protocol). This, in turn, encouraged other countries to ratify, largely due to the protocol’s trade provisions.

Finally, unlike climate change, ozone depletion could be traced to a limited set of chemicals produced by a small number of companies. Though some of the companies (e.g. Du Pont) initially lobbied strongly against the Montreal Protocol, they ultimately succeeded in developing superior chemical substitutes. In this regard, some say that the Protocol benefitted both environment and industry.

But don’t throw away your sunscreen just yet. According to a panel of 300 scientists, the ozone layer is “just starting to heal”. The layer is still about 6% thinner than in the 1980s, and the ozone hole is about 20 million square kilometres (compared to 30 million in 2006). At this rate, the UN believes that it will take until 2050 for the ozone layer to return to “relatively healthy 1980s conditions” (2075 around Antarctica).

To learn more, please contact the Natural Resources Defense Council, the David Suzuki Foundation, or the Canadian Wildlife Foundation.

Flickr photo credit: NASA Goddard Space Flight Center


In the Red: World Surpasses Annual Natural Resource Budget

Last week, on August 19, the planet reached Earth Overshoot Day. Also known as Ecological Debt Day, the date marks the point at which “humanity has used up its natural resource budget for the year”, including land, trees, and food. In other words, “our use of resources has started to exceed the Earth’s ability to regenerate”.

The Global Footprint Network, a California-based non-profit, calculates Earth Overshoot Day annually. The Network divides the “world’s biocapacity – the amount of natural resources generated by the planet that year – by humanity’s natural consumption of Earth’s resources”. It factors in approximately 6,000 data points for 230 countries, territories, and regions. It then multiplies the resulting ratio by the number of days in the year to obtain the overshoot date. According to the Network, the result is accurate to within 15%.
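
To make the arithmetic concrete, here is a minimal sketch (in Python, purely for illustration) of the calculation described above: the year’s biocapacity divided by humanity’s footprint, multiplied by the days in the year, gives the overshoot date. The ratios plugged in are rounded figures from this article, not the Network’s 6,000-point dataset.

```python
# Minimal sketch of the Earth Overshoot Day arithmetic described above.
# Overshoot day = (world biocapacity / humanity's footprint) x days in the year.
# The inputs below are illustrative ratios, not the Network's full dataset.
import datetime

def overshoot_day(biocapacity: float, footprint: float, year: int) -> datetime.date:
    """Date on which demand for resources exceeds what the planet regenerates that year."""
    days_in_year = (datetime.date(year, 12, 31) - datetime.date(year, 1, 1)).days + 1
    day_number = int(biocapacity / footprint * days_in_year)
    return datetime.date(year, 1, 1) + datetime.timedelta(days=day_number - 1)

# Humanity's 2014 demand was roughly 1.5 Earths (per the article); a ratio of
# about 1.58 is what reproduces the August 19 date exactly.
print(overshoot_day(biocapacity=1.0, footprint=1.58, year=2014))  # 2014-08-19
print(overshoot_day(biocapacity=1.0, footprint=1.5, year=2014))   # 2014-08-31
```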

The arrival of Earth Overshoot Day on August 19 is certainly worrisome. As noted in the Daily Mail, “Earth is in overdraft just EIGHT months into the year”. But even more alarming is the shift in Earth Overshoot Day over time. In 1961, “humans used only around 3/4 of the Earth’s capacity for generating food, timber, fish and absorbing greenhouse gases, with most countries having more resources than they consumed”. Since then, however, Earth Overshoot Day has crept earlier each year. In 1967, the US began consuming more than its natural budget. By the 1970s, “enough countries had moved from ecological creditor to ecological debtor status that the earth as a whole was overshooting its sustainable supply of critical resources”. This creep has accelerated in recent years. In 2000, Earth Overshoot Day fell on November 1. In 2009, it was September 25.

According to Mathis Wackernagel, President of the Global Footprint Network, global overshoot is becoming “a defining challenge of the 21st century”. Already, in 2014, it would take 1.5 Earths to produce the renewable resources needed to support current human consumption. By mid-century, however, “moderate population, energy and food projections suggest that humanity would require the biocapacity of 3 planets”. The resulting “deforestation, fresh-water scarcity, soil erosion, biodiversity loss, and the build-up of CO2 in our atmosphere” will come with enormous human and economic costs. Countries with resource deficits and low incomes are particularly vulnerable, but high-income countries will also pay a price.

In response, the Global Footprint Network urges countries to implement long-term solutions before “such dependencies…turn into a significant economic stress”. In the words of Wackernagel, “Government can always print more money, but it can’t print more planet. Ecological overshoot should lead the political agenda”. Already, Global Footprint has inspired several countries to take action. The Philippines is implementing national land-use planning, Morocco is investing in sustainable agriculture systems, and the UAE is installing high-efficiency lighting. Individuals can also act. Using the organization’s personal calculator, they can pinpoint their own overshoot day, and reduce their consumption accordingly.

To learn more, please contact the Global Footprint Network, the Environmental Defense Fund, or our partners, the Ocean River Institute and Solar Sonoma County.

Flickr photo credit: Bryan Wysoglad 

Waste Not, Want Not: Solving the Global Food Crisis

Ever find yourself throwing out last week’s groceries, or only finishing half your meal? You are not alone. According to the latest research from the Food and Agriculture Organization (FAO), roughly 1/3 of food produced for human consumption is lost or wasted globally. This amounts to 1.3 billion tonnes of food, or 57% of all the calories harvested each year.

The magnitude and cause of the problem vary by region. In the developing world, per capita food waste equates to 6–11 kg/year. Most of this waste occurs at the start of the supply chain, due to inefficient harvesting, inadequate local transportation, and poor infrastructure (e.g. a lack of refrigeration facilities). In the developed world, however, per capita waste equates to a shocking 95–115 kg/year. Waste occurs primarily at the end of the supply chain, as supermarkets “often reject entire crops of perfectly edible fruit and vegetables…because they do not meet exacting marketing standards”. Consumers also purchase “excessive quantities” of perishable foods and dispose of food early due to “confusing” food labels.

The human impact of such waste is considerable. After all, about 1 billion people globally (or 1 in 7) continue to suffer from malnutrition or starvation. According to the Feeding 5000 Campaign, the vast quantity of wasted food “would be enough to satisfy the hunger of every one of them”.

But “wasting food means losing not only life-supporting nutrition”. It also means losing “precious resources, including land, water and energy”. In fact, producing wasted food requires 28% of global farmland – an area approximately the size of Mexico. It uses enough freshwater to meet the domestic needs of 9 billion people (200 L each per day). Furthermore, since each calorie of food takes an average of 7–10 calories to produce, wasted food consumes a significant portion of the global energy budget. This, in turn, generates 6–10% of greenhouse gas emissions, contributing “to unnecessary global warming”.

To compound the problem, the United Nations projects that the global population will reach 9.5 billion by 2075 (based on mid-range forecasts). This represents over 2 billion more mouths to feed. Furthermore, “substantial changes are anticipated in the wealth, calorific intake, and dietary preferences of people in developing countries”. Hence, the demand for food—particularly resource intensive food like meat—is expected to increase.

Fortunately, awareness of global food waste is also growing and key players are starting to act. The United Nations, for example, has launched the Think.Eat.Save program, which works to galvanize global action and exchange ideas. Farmers are donating “edible but imperfect-looking” crops to local food charities, such as City Harvest. Food processors are finding innovative ways to salvage previously rejected foods, such as “making baby carrots out of carrots too bent to meet retail standards”. And grocery stores like Waitrose and Sainsbury are cutting the prices of expiring goods, donating leftovers to charities, and sending remaining food waste to bio-plants for electricity generation.

Such efforts offer real hope of solving the global food problem. In fact, the United Nations reports that “cutting the rate of food loss and waste in half by 2050 would close 20% of the (expected) food gap.”

To learn more about the food waste problem, please click here. To help reduce hunger, please donate to one of our partner food banks: Friends of Saint Joseph’s Food Pantry, Operation: Sack Lunch, the Oregon Food Bank, the San Francisco-Marin Food Bank, the Vermont Food Bank, and the Second Harvest Food Bank of Santa Clara and San Mateo Counties.

Flickr photo credit: Kabsic Park


The Clean Power Plan: Obama’s “Last Sweeping Effort to Remake America”

Earlier this week, President Obama unveiled his latest plan to reduce greenhouse gas emissions and slow climate change. Dubbed the “Clean Power Plan”, it calls for the power industry to cut carbon dioxide emissions 30% by 2030 (from 2005 levels).

To achieve this reduction, the plan establishes individual targets for states. The targets reflect each state’s economy, current emission levels, and capacity for cuts. For example, coal-dependent Kentucky has a target of 19%, while hydro-rich Washington has a target of 84%. The plan, like the Affordable Care Act, also gives states great flexibility in meeting their targets. Among other actions, states can shut down coal plants, invest in wind and solar power, install energy efficient technology, join a cap-and-trade program, or enact a state tax on carbon pollution.

From a purely environmental perspective, many are hailing the plan. After all, the United States’ roughly 1,000 power plants account for nearly 40% of all US carbon emissions – and currently face no carbon pollution limits. They also emit harmful particle pollution, nitrogen oxides, and sulfur dioxide, which are expected to decline by over 25% as a “co-benefit” of the plan.

Some also believe that the Clean Power Plan will shape climate change policy abroad. Christiana Figueres, Executive Secretary of the UN Framework Convention on Climate Change, claims that she “fully expects [the US plan] to spur others in taking concrete action”. (This includes countries like Canada, which have largely tied their emissions targets to the US). Moreover, by sending “a strong signal that the US is serious about reducing carbon pollution”, the plan will “give Washington legitimacy in international talks next year to develop a framework for fighting climate change”.

But despite all the praise for Obama’s “most ambitious plan yet”, not all environmentalists are giving the plan a green thumbs up. First, some feel that the plan does not go far enough. They note that carbon emissions already fell 15% between 2005 and 2013, in part due to the “retirement of coal plants in favor of cleaner-burning natural gas”. This suggests that the 30% target by 2030 is “arguably easier to reach”. In the words of Andrew Revkin of the New York Times, the “Obama plan in effect takes twice as long (16 years) to cut as much carbon pollution as the country just did in 8 years”.

Furthermore, the Obama plan is limited to only one sector of the economy: power generation. According to the Food & Water Watch and the Institute for Policy Studies, this narrow scope will leave the US “far short of the IPCC’s goals for developed countries of economy-wide reductions of 15 to 40% below 1990 emissions by 2020”. In fact, with these targets alone, US emissions will still exceed 1990 levels by 2030.

Finally, the Obama plan aims to phase fossil fuels (particularly coal) out of the US energy mix.  However, until economically competitive substitutes hit the market on a large scale, this may be a “waste of effort”. As highlighted by Steve Cohen, Executive Director of Columbia University’s Earth Institute, “The vast increases in fossil fuel based energy use in China and India alone virtually guarantee continued global warming. Only a lower-priced, reliable, and convenient replacement for fossil fuels will make a difference”.

So what can we make of this debate? Will the Clean Power Plan turn the “red, white, and blue” green? Certainly, by targeting the single largest source of carbon dioxide in the United States, the plan is a “creditable” start. And, without question, it is better than the status quo (no carbon regulations for power plants). However, the plan alone is not a panacea for global climate change. To make real headway, it must be complemented by investment in the research, development, and dissemination of renewable substitutes.

To learn more about the Clean Power Plan, please contact the Center for Climate and Energy Solutions, the US Climate Action Network, or our partners: Clean Air Cool Planet and the American Lung Association in California.

Flickr photo credit: Thunderbolt_TW


Global Warming a “Death Sentence” for Coral Reefs

In April, I spent a sun-soaked week in Santa Lucia, a beach town on Cuba’s northern coast. At the encouragement of the locals, I went snorkeling in the nearby coral reef (one of the longest in the world). I was amazed by the diversity of sea life. Among other creatures, I saw snappers, needlefish, and even a barracuda!

But, due to a phenomenon known as “coral bleaching”, these creatures and their habitat are under increasing threat. Coral bleaching occurs when coral expels the colorful algae (zooxanthellae) living in its tissues. The tissues become transparent and “the coral’s bright white skeleton is revealed”. Although bleached corals are not yet dead, most “begin to starve once they bleach”, as the zooxanthellae “provide up to 90% of the energy corals require to grow and reproduce”. Thus, unless conditions return to normal, most bleached corals eventually die and decay.

So why does coral expel algae? According to an expert at the Australian Institute of Marine Science, “any environmental trigger that affects the coral’s ability to supply the zooxanthellae with nutrients for photosynthesis will lead to expulsion”. These triggers could include changes to water salinity, quality (from urban and agricultural runoff), and sedimentation (from underwater activity or massive dust storms). They could also include oxygen starvation or diseases/infections in the coral.

However, the most common cause of coral bleaching is a rise in sea temperatures—from global warming, El Niño, etc. In fact, a temperature increase of a mere 1 degree Celsius for four weeks can trigger bleaching. If these temperatures “persist for longer periods (eight weeks or more), coral begin to die”.
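
To spell out that rule of thumb, here is a toy Python classifier based only on the thresholds quoted above (a sustained anomaly of about 1 degree Celsius, roughly four weeks to bleach, eight or more weeks to kill). Real reef monitoring, such as NOAA’s Degree Heating Week products, is far more nuanced; this sketch merely illustrates the logic.

```python
# Toy classifier for the bleaching thresholds quoted above. Purely
# illustrative; the threshold values and labels are assumptions drawn
# from the article's rule of thumb, not an operational model.

def coral_stress(temp_anomaly_c: float, duration_weeks: float) -> str:
    """Classify coral stress from a sustained sea-surface temperature anomaly."""
    if temp_anomaly_c < 1.0:
        return "within normal range"
    if duration_weeks >= 8:
        return "prolonged heat stress: corals begin to die"
    if duration_weeks >= 4:
        return "bleaching likely"
    return "heat stress building"

print(coral_stress(1.2, 5))   # -> bleaching likely
print(coral_stress(1.5, 10))  # -> prolonged heat stress: corals begin to die
```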

For example, in 2005, the US lost half of its Caribbean coral reefs when the waters near the Virgin Islands and Puerto Rico produced a “thermal stress…greater than the previous 20 years combined”. Similarly, in 1998, 50% of the Great Barrier Reef suffered bleaching as “sea temperatures on the…Reef were the highest ever recorded“. But the most severe bleaching from warm water has occurred in the Indian Ocean. Consequently, “up to 90% of coral cover has been lost in the Maldives, Sri Lanka, Kenya, Tanzania, and the Seychelles”.

This large-scale coral bleaching poses a significant environmental problem. Although reefs comprise less than 1% of the earth’s underwater ecosystem, they shelter a full 25% of marine species. Specifically, they provide “spawning, nursery, refuge, and feeding areas for a large variety of organisms, including sponges, cnidarians, worms, crustaceans, molluscs, echinoderms, sea squirts, sea turtles, and sea snakes”. If these species dwindle, there will be “a tremendous cascade effect for all life in the oceans” potentially leading to a “complete collapse of the marine ecosystem”.

Such an outcome would also put humans in hot water. According to the United Nations, approximately “1 billion people, largely in developing countries, rely on fish as their primary animal protein source”. A further 3.2 billion consume fish for over 15% of their animal protein. Ocean fisheries also provide direct employment to about 45 million people and indirect employment to 180 million people. Add in their families and a full 540 million people “depend on some aspect of catching, farming, processing, or distributing fish for their economic wellbeing”. Furthermore, coral and marine life are a “significant attraction for the tourism industry”. In fact, “many Caribbean countries get nearly half their gross national product from visitors seeking tropical underwater experiences”. Given these realities, experts suggest that, “If the reefs vanished… hunger, poverty, and political instability could ensue.”

Of further note, “reef structures play an important role as natural breakwaters”. They help minimize the impact of waves from storms (e.g. cyclones, hurricanes, or typhoons) on coastal communities and beaches.  Many also believe that corals could be home to the next medical breakthrough. Though 95% of the ocean remains unexplored, scientists have already discovered key treatments for asthma, arthritis, and cancer in the so-called “underwater pharmacy”.

Regrettably, as the climate changes, “coral bleaching is predicted to become more frequent and severe”. In fact, the IPCC’s moderate warming scenarios (B1 to A1T, 2°C by 2100) forecast that “corals on the Great Barrier Reef are very likely to regularly experience summer temperatures high enough to induce bleaching”. This has led some experts to predict that “most of the world’s coral reefs could be killed within our lifetime”.

To avert such a future, non-profit organizations are taking both proactive and reactive measures. They are encouraging others to lower their carbon footprints and to lobby their public officials for comprehensive climate legislation. They are collaborating with countries to establish marine protected area networks. Moreover, they are working to increase the resiliency of coral reefs by developing, sharing, and implementing science-based strategies to “better respond to bleaching events”.

For more information on coral bleaching, and how you can help, please contact the Nature Conservancy, the World Wildlife Fund, the National Wildlife Federation, or our partners, the Ocean River Institute, the Living Planet Aquarium, and the Blue Ocean Society for Marine Conservation.

Flickr photo credit: Only Point Five