News + Media

Our optimism is rooted in the ingenuity of the world’s farmers.
Recent Event
Cargill

Creating a more food-secure world through adaptation and resilience

MIT Global Change Forum - Boston, Massachusetts

Greg Page, Cargill Chairman and Chief Executive Officer
June 4, 2013

(As prepared remarks)

I am very happy to be here this evening and glad that John Reilly extended the invitation. The theme for this year’s forum is “Water, Food and Energy in a Changing World.” In the past, I know these forums have focused on constrained resources like water and energy, so tonight we will talk more about food – which is the great natural combination of water and energy. 

My remarks this evening hopefully will be a good kickoff for two interesting days talking about the intersection of water, food and energy, which are more closely linked than ever before. 

There could be 9 billion people on this earth in 40 years….and we will feed them. How can we do it given the myriad factors – including climate – that are part of the food security puzzle? It's a challenge that will take our collective wisdom to solve. And it will require adaptive behaviors and resilience.

About Cargill

To give context for my remarks tonight, let me share just a few comments about Cargill.

Cargill is a company that began in the Midwest almost 150 years ago and has grown over time to where two-thirds of our employees are outside the U.S. In short, Cargill has globalized along with the world’s GDP.

Cargill operates in four key segments. The first is the business of taking food and crops from times and places of surplus, to times and places of deficit. That traditional role of Cargill in grains and primary oilseeds represents about 25 percent of the company.

The next segment is providing farmers with a variety of services and access to markets.

Third is our food and meat businesses. Our businesses here include cocoa and chocolate, malt, corn milling, flour, salad dressings, vegetable oil, and a fairly significant meat business.

Finally, Cargill has a risk management business. We trade ocean freight, coal, electricity, natural gas, petroleum, iron ore and basic metals. Clearly the prices of these commodities, particularly freight, petroleum and energy, have a dramatic impact on agriculture.


Cargill and MIT

Cargill has been a sponsor of the MIT Joint Program on the Science and Policy of Global Change since 2008. I know this forum has gained an international reputation for serious and frank discussions of global issues.

MIT’s program is one of several university programs Cargill funds that are helping us better understand climate science, impact on crop yields, sustainability and implications for food security. (Others include Stanford’s Center for Food Security and the Environment, and the University of Minnesota’s Global Landscapes Initiative.) Cargill also has engaged with various think tanks to build our understanding of climate change issues, including Resources for the Future. We also appreciate the work of IFPRI – the International Food Policy Research Institute – which is on the agenda tomorrow…and others in the nonprofit / IGO / academia space who are devoting their time and talent to the challenge of feeding the world.

Some of Cargill's most important philanthropic partnerships are with nonprofits like CARE, Feeding America, The Nature Conservancy, TechnoServe and others who are trying to make the world more food secure by either raising incomes, expanding access to food or ensuring that food production is done in an environmentally responsible way.

The complexity of food security

Of all the challenges facing our world today, none is more immediate than the need to provide sufficient nutrition for all. Food security involves interdependent parts, and having all those parts working together is what is complicated.

The globe’s population is not only increasing, it is becoming more urban and more affluent.  Our ability to meet that challenge is affected by these factors:

  • Diets are changing as income levels rise.
  • Biofuels have become a significant consumer of traditional crops.
  • Public investment in agricultural research has been declining.
  • Government policies that inhibit trade or limit productivity are affecting food availability and price.
  • And localized supply shocks and production shortfalls continue to occur – although in 2012 we had adequate production, even with the U.S. drought – we just didn’t share well.

Some people question whether we can grow enough food, especially in a world that needs to adapt to changes in climate.


Cargill is optimistic

At Cargill, we are optimistic. We believe that the world can feed itself and that we can harness the power of photosynthesis to produce all the nutrition needed for an increasingly prosperous world.

Our optimism is rooted in the ingenuity of the world’s farmers. They are natural innovators - adapting to changes in the environment and technology – proven by the doubling of the amount of grains, rice and oilseeds that they have produced since 1975—without a significant increase in acreage, much of that coming from double cropping.

I will try to show how the stakeholders in world food production are exhibiting the adaptive behaviors that underpin resilience and provide us a more food-secure world.

Can the food systems we rely on adapt?

If we are in a period of accelerated climate change, the question is whether the food systems upon which we rely can adapt.

Sometimes when we hear the word resilience applied to agriculture, we think of a hard-working and stoic farmer valiantly saving his crop from pests, drought and frost. What I’m trying to convey is a much broader notion of “systemic resilience,” where all major stakeholders in the global food system are poised, able and willing to build solutions to broad challenges in an effective and, most important, complementary way: a global system that is sufficiently flexible to produce enough food despite localized disruptions.

The resilience of farmers

Let’s start where it all begins. On the farm.

While many people are working on climate change mitigation strategies….the farmer will be busy doing that and doing what he or she has always been doing: adapting! Clearly we need both adaptation and mitigation….it is not an either/or choice.

Farmers are the consummate optimizers. Every year, they look in their field, and look at what has been dealt to them. And -- in the best-run countries -- at the last minute, they make a decision about how to optimize their profitability – to grow what the market is signaling to the best of their ability. They look at input costs, forecasts, relative output prices, soil moisture. And then they plant.

There aren’t many of us who can come to work and turn on a dime as skillfully and naturally as farmers. And while many of my examples today relate to modern Western agriculture, we also see resiliency in smallholder farmers.

The power of price

To farmers, rising commodity prices are a potent fertilizer, motivating them to produce more when the market calls for it. In late March, the USDA predicted U.S. farmers would plant the most corn since 1936 – about 97 million acres. And 77 million acres will be planted with soybeans.  That acreage is predicted to deliver huge harvests. The predictions come with a caveat of course; they are predicated on a return to reasonable weather during the growing season.

If we take good prices to farmers, incredible progress can be made on food security. In developing countries it is an issue of the economic capacity of non-farmers to put enough price into their agricultural systems to create sustainable agriculture.

Growth in non-farm income is a precondition for agricultural development in emerging economies.  Climate, water, seed, technology and agronomy are all important. But the fundamental ingredient of sustainable agriculture is an adequate price to reward the farmer for her efforts.  Without the signaling power of price, there will be no change in farmer behavior.

Better technology at an accelerated rate

Because of recent high prices, farmers have had a run of prosperity and are gobbling up technology like never before. Better technology is coming into agriculture at an accelerated rate. And farmers are willing to invest. Cargill has a role in this, facilitating the transmission of price signals and bringing farmers technology and risk management options so they can make decisions that maximize their profitability.

Here is an example. Some of you may know that at least in the middle of the country, we had a very late-arriving spring, and planting was delayed. But in another example of resilience, in just one week, 52% of Minnesota’s corn crop was planted. That is the fastest one-week corn planting on record.

Cargill. View from tractor seat. Planting a field.
A farmer plants his crops at night. From the tractor seat, the monitor and the light bar on the hood are visible against near-total darkness, yet the planter keeps running – because the GPS is steering.

Adaptation at work 

Here you can see a photo from one of our farmer customers who is planting at night. From the tractor seat you can see the monitor and the light bar on the hood – and almost nothing else in the dark – yet the planter is still operating, because the GPS is steering. About 75% of our customers can now plant 100% of their corn and soybeans in seven days or less.

And one of our customers in Ohio planted 6,000 acres of corn and soybeans in four and a half days. That is what adaptation, profitability and reinvestment have done on the farm.

Key driver of production: yield increase 

Over a long period of history, the main contributor to increased food production has been yield gain through genetic improvement and fertilizer use. Acreage has been relatively stable from 1975 to the early 2000s. Only recently have we seen harvested acreage increase.

Farmers have met the challenge of increasing demand for food. But the environmental price of our practices in some cases was too high, and the debt to Mother Earth is now being repaid through a host of remediation efforts such as ag setbacks and buffer strips to prevent runoff; better tillage and reduced-tillage practices; restoration of wetlands; and bio-digesters that recover energy from dairy waste – to name just a few.

Cargill. Satellite imagery for determining yield potential and crop inputs.
A satellite image for precision agriculture. Cargill's Next-Field system uses satellite images and soil sampling to determine yield potential and crop inputs.

Precision agriculture

One of the ways farmers are meeting the challenge is through optimization of inputs. Precision agriculture is the integration of all sorts of optimization tools.

Basic precision agriculture uses satellite images and soil sampling to come up with an average yield potential for a field, and crop inputs like fertilizer are applied based on average yield goals. Cargill’s Next-Field system goes several steps further and develops 2.5-acre yield zones within a field to more precisely apply inputs where they make the most difference, based on that area’s individual yield environment – and in the process reduces waste and environmental impact. We should be impressed that the free market, without any intervention, has incented the conservation of valuable resources and improved sustainability.
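As a rough illustration of how zone-based application differs from a uniform, field-average rate, here is a minimal Python sketch. The zone layout, the yield goals and the nitrogen rule of thumb are invented for illustration; this is not Cargill's Next-Field model.

```python
# Hypothetical sketch of zone-based variable-rate fertilizer application.
# The zone size matches the 2.5-acre zones described in the talk; the yield
# goals and the nitrogen rule of thumb are invented for illustration and are
# not Cargill's Next-Field model.

ZONE_ACRES = 2.5          # yield-zone size from the talk
N_PER_BUSHEL = 1.1        # assumed lbs of nitrogen per bushel of corn yield goal

# Per-zone yield goals (bu/acre) for a small, hypothetical 10-zone field
zone_yield_goals = [150, 165, 180, 140, 175, 160, 155, 185, 170, 145]

field_avg_goal = sum(zone_yield_goals) / len(zone_yield_goals)

# Uniform application: every zone gets the field-average rate
uniform_lbs = ZONE_ACRES * N_PER_BUSHEL * field_avg_goal * len(zone_yield_goals)

# Variable-rate application: each zone gets only what its own goal supports
variable_lbs = sum(ZONE_ACRES * N_PER_BUSHEL * g for g in zone_yield_goals)

for g in zone_yield_goals:
    print(f"zone goal {g:3d} bu/ac -> apply {N_PER_BUSHEL * g:.0f} lbs N/ac")
```

With a linear rule of thumb the two totals happen to match; the point is allocation – the uniform rate over-applies nitrogen in low-potential zones, where the surplus can run off, and under-applies it where it would do the most good.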

Another example of resilient behavior in farmers was featured in a May 20 New York Times story about the Ogallala Aquifer. The aquifer is under depletion stress in High Plains states as a result of intensive farming and drought. The story was loaded with examples of how farmers are adapting to this changing environment: switching to raising dairy heifers, or switching to less water-thirsty crops such as sorghum. Or deciding to rely on rain alone and the lower resulting corn yields.

Farmers also take steps to protect moisture and soil conditions in their fields through conservation tillage and tractors with wide tracks that have a light footprint. They “tiptoe” across fields so they don’t cause soil compaction. Our industrial innovators captured this market opportunity, proving their resilience.

In addition to farmers, livestock producers are optimizers, too. For example, there has been a huge surge in the addition of enzymes to animal feed, born largely out of rising ingredient prices partially connected to the ethanol boom, which signaled a need to improve feed conversion into meat and milk.

The resilience of governments and policymakers

Let’s move off the farm and into the halls of government. How are governments and policymakers showing resilience?

Fortunately, when it comes to behaviors that distort markets and disincentivize farmers, governments today are for the most part showing more restraint. Unlike the embargoes we saw in 2008-2009, there have been fewer market- and price-distorting behaviors of late, such as artificially suppressing prices, hoarding supplies or banning imports or exports.

But there are exceptions. We have seen this clearly in the Indonesian beef market, where steps were taken to block live cattle and boxed beef imports in an effort to spur local production. This has resulted in dramatically higher prices for Indonesian consumers and lower supplies on grocery store shelves.

For those of us who believe the economist David Ricardo, we know that self-sufficiency is not the answer. The world will always raise the most food the most economically and in the most environmentally responsible way when farmers plant the right crops for their local climate and soils using the right technology, then trade with others.  If every government set a goal of food self-sufficiency, the world would have much less food.

Resilience in the world’s largest agricultural economy

Another example of resilience is what has been happening in China. Policymakers in the world’s largest agricultural economy have shown their ability to adapt and change behaviors.

China has helped the world in aggregate produce more food through its decision to honor comparative advantage and import soybeans. When it uses its scarce land to produce corn, wheat and rice, which yield relatively better in China, and imports soybeans and vegetable oils, which yield relatively more poorly there, the world in total raises more food.

The challenge for China now is that it built its model for agriculture on the assumption of inexpensive and widely available labor, and a multi-cropping environment. As wages rise, urbanization continues, and agricultural land reform evolves, China again will be tested in terms of its resiliency.

Africa’s critical role in feeding the world

Africa is another example where government action and policymaking are so critical to our ability to feed the world.

In short, Africa has the soil and the rainfall -- but not the policy, infrastructure and rule of law -- that are necessary, along with higher non-farm income, for increasing food production. But we see positive changes in the works.

Much of Africa has seen some of the lowest productivity gains of the last 40 years. But there are countries on the continent that have shown resilience and their commitment to change the face of agriculture.

As just one example, Nigeria is working hard to transform its agricultural sector – in its own words, “treating agriculture as a business and not a development program,” and treating farmers as business people and not aid recipients. Nigeria is interested in attracting private-sector investment and creating the right conditions for both smallholder and large-scale farmers to succeed and collaborate.

Ensuring production through science and innovation

How are policymakers helping ensure the availability of food through their support of science and innovation? I believe the acceptance of science is the foundation to being resilient.

Clearly the world has benefited from the application of food technology and particularly, the appropriate and well-regulated use of genetic engineering to create foodstuffs that are cheaper to produce and require less water, chemicals and tillage.

But the use of that technology is under debate in the United States. In American agriculture, resistance to genetically modified (GM) products was long seen as a European issue, but now it is an American issue. A ballot initiative in California that called for front-of-package labeling of any GM product was proposed and then defeated, but the campaign made clear that the public does not sufficiently understand GM. We need to reach out to both governments and consumers to better explain the benefits of this technology. We need to gain society’s permission to use sound, proven and well-regulated science in the production of food.

The resilience of consumers

While producers and policymakers are changing their behaviors, so are consumers.

I would argue we have seen the resilience of the world’s population as they faced higher food prices in the past six years.  As food prices spiked, the conventional wisdom was that it would be a big challenge for developing countries…and that they would have no defense against the rising cost of food.

Instead, GDP growth rates in developing economies have continued to climb over that period. Part of the reason is that as much as 70% to 80% of the population is involved in farming…and higher prices have created an income opportunity for farmers. In fact, they have been thriving…because they are producing and selling into this better price environment.

Let me share an astonishing fact with you... Based on our tracking of global incomes, we have seen real, inflation-adjusted GDP of the least affluent 70% of the world's population more than double in the last 10 years.  To me that's incredible resilience during a period so widely lamented.
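As a back-of-the-envelope check on that figure (the doubling claim is from the remarks; the arithmetic is standard compound growth), real GDP doubling in ten years implies roughly 7 percent annual growth:

```python
# Implied compound annual growth rate (CAGR) when real GDP doubles in 10 years.
# The doubling claim comes from the remarks; the formula is standard.
years = 10
growth_multiple = 2.0   # "more than double", so treat 2x as the lower bound

cagr = growth_multiple ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")   # roughly 7.2% per year

# Sanity check: compounding that rate for 10 years recovers the doubling
assert abs((1 + cagr) ** years - growth_multiple) < 1e-9
```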

You may be surprised to hear that we collectively return less than 2% of global GDP to farmers for the calories they bring us in basic foodstuffs.  We can afford to be thoughtful in how we compensate them.

Food is emotional

We also need to acknowledge that food is personal.

In developed countries, people increasingly want to know the "story" behind their food. Just look at the bookstore shelves to see how much we as consumers contemplate our diets. Food is emotional, but we need to address it with hard science in a resource-constrained world. We shouldn’t return to medieval agriculture. We need a science-driven agenda, not an emotion-driven one.

Global production and consumption

So what kind of results does this resiliency produce?

This chart shows that the world’s year-to-year variability in total crop production versus the trend line is no greater than it was 35 years ago. So despite the headlines that the world has become a far more desperate and volatile place, actual data on deviations from the trend line in global agricultural production are no different from the 1970s. We don’t deny the possibility, or even the likelihood, of looming challenges ahead related to agriculture, but we haven’t seen the impact of climate change in our data so far. Climate change is a risk we cannot dismiss frivolously. We need the scientific community -- and many of you in this room -- to better define the risk, based on science.

The changing Canadian Prairie provinces

In North America, we are seeing changes not just in how food is produced, but where it is produced.

I grew up in North Dakota, where the big question up for debate in the spring was: would you plant your wheat on May 20 or May 28? Now the question is much more complicated. Your options are not just wheat; now corn, soy, canola, sunflower and lentils enter the equation. Clearly, the variety of crops being grown in Bottineau County, North Dakota, is quite different from when I graduated from high school. So what does this example show?

Is the planting of corn a response to an increased number of frost-free days in the Northern latitudes? The answer is yes. At least in this microclimate it seems like something structural is happening. But I would argue that genetics, price and crop insurance contribute at least as much as frost-free days. Farmers’ decisions are based on a host of elements.

It is also true that these same factors are driving investments in Canada’s Prairie Provinces of Manitoba, Saskatchewan and Alberta….by Cargill and others, and that Canada’s grain mix is being transformed.

While changes in temperature and moisture may play out differently on different continents, I was encouraged to see the preliminary research that MIT presented at the last Forum on the world’s breadbaskets. I recognize that this work is preliminary but I congratulate MIT for tackling such a highly charged issue. In the absence of climate mitigation policies, the ability to adapt is critical. That is why this research is so important.

The importance of working together 

Agricultural production has always been affected by variability in weather...and farmers have adapted strategies appropriate to their local situation.

Who knows when and how much we will tip the scales from weather volatility to climate change? We all need to watch carefully for this future. We are looking to scientists to help us with these questions – to help define the borders and the scenarios so we can help create a more sustainable food system.

I believe that we have the power to adapt, and that the resilience we have shown in the face of change will continue.

So if there is one point to leave you with, it is the importance of working together on this issue of feeding the world. That is why we are so pleased to see MIT involved in this work.

Cargill is optimistic that we can, in fact, feed our world – even in a changing environment.

NOTE: These are the speaker’s “as-prepared” remarks.

E2e
News Release
MIT News Office

June 17, 2013
Vicki Ekstrom
MIT Energy Initiative

Energy efficiency promises to cut emissions, reduce dependence on foreign fuel, and mitigate climate change. As such, governments around the world are spending tens of billions of dollars to support energy-efficiency regulations, technologies and policies.

But are these programs realizing their potential? Researchers from the MIT Energy Initiative (MITEI) and University of California at Berkeley’s Haas School of Business have collaborated to find out.

The researchers’ energy-efficiency research project, dubbed “E2e,” is a new interdisciplinary effort that aims to evaluate and improve energy-efficiency policies and technologies. Its goal is to support and conduct rigorous and objective research, communicate the results and give decision-makers the real-world analysis they need to make smart choices.

The E2e Project is a joint initiative of the Energy Institute at Haas and MIT’s Center for Energy and Environmental Policy Research (CEEPR), an affiliate of MITEI — two recognized leaders in energy research.

The project’s name, E2e, captures its mission, the researchers say: to find the best way to go from using a large amount of energy (“E”) to a small amount of energy (“e”), by bringing together a range of experts — from engineers to economists — from MIT and UC Berkeley. This collaboration, the researchers say, uniquely positions the E2e Project to leverage cutting-edge scientific and economic insights on energy efficiency.

“Cutting energy has lots of potential to help us save money and fight climate change,” says Michael Greenstone, MIT’s 3M Professor of Environmental Economics and a member of MITEI’s Energy Council. “It’s critical to find the local, national and global policies with the biggest bang for the buck to use governments’, industry’s and consumers’ money wisely while slowing climate change.” 

Greenstone is leading the project with Christopher Knittel, co-director of CEEPR, and Catherine Wolfram, associate professor and co-director of the Energy Institute at Haas.

“When deciding on the best energy measures to implement, decision-makers should compare model predictions to actual consumer behaviors. That’s where this project comes in,” Wolfram says. “The E2e Project is focused on singling out the best products and approaches by using real experiments centered on real buying habits. It will provide valuable guidance to government and industry leaders, as well as consumers.”

The group’s motivations for studying energy efficiency are derived, in part, from the McKinsey Curve — a cost curve suggesting that much emissions abatement actually pays for itself.

“Our goal is to better understand what the costs and benefits of energy-efficient investments are — where the low-hanging fruit is, as well as how high that fruit is up the tree,” says Knittel, MIT's William Barton Rogers Professor of Energy Economics at the MIT Sloan School of Management. “The McKinsey curve would suggest the fruit’s already on the ground. If this is true, we want to figure out why no one is picking it up.”

Former U.S. Secretary of State George P. Shultz, a member of the E2e advisory board, says, “I like the saying ‘A penny saved is a penny earned,’ which rings true from the standpoint of energy. Energy that is used efficiently not only reduces costs, but is also the cleanest energy around. The E2e Project will allow us to better understand which energy-efficiency programs save the most pennies.”

Shultz is a distinguished fellow at Stanford University’s Hoover Institution, where he leads the Energy Policy Task Force. The board also includes MIT Institute Professor John Deutch, former undersecretary of the Department of Energy; Cass Sunstein, a professor at Harvard Law School and President Obama’s former director of regulatory affairs; Susan Tierney, managing principal at Analysis Group and a former Department of Energy official; and Dan Yates, CEO and founder of Opower.                                                           

The E2e Project seeks to answer questions such as: Are consumers and businesses bypassing profitable opportunities to reduce their energy consumption? What are the most effective ways to encourage individuals and businesses to invest in energy efficiency? Are current energy-efficiency programs providing the most savings?

The project’s first experiments are already underway. For example, the team is tracking consumers’ vehicle purchasing decisions to discover if better information about a car’s fuel economy will influence consumers to buy more fuel-efficient vehicles. If so, emphasizing the calculated fuel savings in the vehicle information presented to consumers may be productive. 

Other initial projects include evaluating the Federal Weatherization Assistance Program, and determining why households invest in energy efficiency and the returns to those investments.

More information: e2e.haas.berkeley.edu or e2e.mit.edu

The E2e Project was funded with a grant from the Alfred P. Sloan Foundation.

energy efficiency
In The News
Washington Post: Wonk Blog

It’s something we hear from policymakers again and again: The world squanders too much energy. And wringing out that waste should be one of the easiest ways for the United States and other countries to save money and curb pollution.

But as it turns out, much of what we know about energy efficiency is still fairly hazy. Sure, it’s technically doable to make cars more fuel-efficient or insulate homes to prevent heat from leaking out. But which of these efforts are really the most cost-effective? And if it’s such a no-brainer, why aren’t people already taking these steps?

The fact that we still don’t have great answers to those questions is what inspired a group of economists at MIT and the University of California, Berkeley to launch a big new project, called E2e, that will try to apply more scientific rigor to the whole topic of energy efficiency.

“Almost all of the previous work on energy efficiency comes from engineering studies, which look at what’s possible under ideal conditions,” says Michael Greenstone, an economist at MIT and co-director of the E2e project. “We wanted to ask a slightly different question — what are the actual returns you could expect in the real world?”

Here’s what he means. In 2009, McKinsey & Co. released an eye-popping study demonstrating that the United States could hugely improve the efficiency of its homes, offices and factories, through strategies like sealing leaky building ducts and upgrading old appliances. By doing so, McKinsey estimated, the country could save $680 billion over 10 years and do the climate equivalent of taking all the nation’s cars off the road.

Yet as economists scrutinized those numbers, they realized the picture is more complex. “Those engineering studies can’t account for the behavioral changes you might see in response to efficiency improvements,” says MIT’s Christopher Knittel, who also co-directs the E2e project. “People could, for instance, start adjusting their thermostat if it becomes cheaper to cool the house.” (This is known as the “rebound effect.”)
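A minimal numeric sketch of the rebound effect Knittel describes, with hypothetical numbers chosen only for illustration: an upgrade that looks like a 30 percent saving on paper delivers less if cheaper cooling leads to more use.

```python
# Hypothetical illustration of the rebound effect: engineering savings are
# partially offset when a cheaper energy service gets used more. All numbers
# are invented for illustration.
baseline_kwh = 1000.0        # annual cooling energy before the upgrade
engineering_savings = 0.30   # 30% savings predicted under ideal conditions
rebound = 0.25               # assume 25% of the savings are taken back as extra use

predicted_kwh = baseline_kwh * (1 - engineering_savings)
actual_kwh = baseline_kwh * (1 - engineering_savings * (1 - rebound))

print(f"Predicted use: {predicted_kwh:.0f} kWh; actual use: {actual_kwh:.0f} kWh")
print(f"Realized savings: {baseline_kwh - actual_kwh:.0f} kWh "
      f"vs {baseline_kwh - predicted_kwh:.0f} kWh predicted")
```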

Ideally, says Knittel, researchers would start conducting rigorous, randomized controlled trials to find out precisely how effective various efficiency policies are. The E2e website lists some of the detailed work that has been done on this front — though there aren’t many such studies.

One recent study of Mexico, for instance, found that a government program to help people to upgrade their refrigerators with energy-saving models really did curtail electricity use. However, a similar program for air conditioners had the opposite effect — when people got sleeker A/C units, they used them more often, and energy use went up.

“The point is that policymakers aren’t going to spend an infinite amount of money trying to save energy or reduce greenhouse gases,” Greenstone says. “So the motivation is to find the places where the return is the greatest. If you could reduce a ton of carbon-dioxide for $100 or two tons for $50, you’d choose the latter.”
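Greenstone's comparison reduces to ranking options by dollars per ton abated. A tiny sketch using his hypothetical numbers (Option A: one ton for $100; Option B: two tons for $50):

```python
# Ranking abatement options by cost per ton of CO2 avoided, using the
# hypothetical numbers from Greenstone's example.
options = [
    {"name": "Option A", "cost_usd": 100.0, "tons_abated": 1.0},
    {"name": "Option B", "cost_usd": 50.0, "tons_abated": 2.0},
]

for opt in options:
    opt["usd_per_ton"] = opt["cost_usd"] / opt["tons_abated"]

# The cheapest abatement per ton wins, regardless of headline cost
best = min(options, key=lambda o: o["usd_per_ton"])
print(f"Choose {best['name']} at ${best['usd_per_ton']:.0f} per ton")
```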

The researchers are also asking why, if it’s so compelling, people and businesses don’t already take steps to become more energy efficient. Is it because people aren’t aware that they can? Are there actual market barriers that could be addressed by policy? (For instance, landlords may have little incentive to invest in energy-saving appliances for their tenants.) Or is it just that the purported savings aren’t worth it in the first place?

“It’s easy to come up with conjectures for why people aren’t choosing more efficient options,” says Catherine Wolfram, an economist at the Energy Institute at Haas in Berkeley. “Maybe people don’t have the right information, maybe people are procrastinating. But right now, these are just stories. It’s an area where we need more evidence.”

Some work is being done on this front. Knittel, for instance, is conducting an experiment to see whether people will buy more fuel-efficient cars if they simply receive more detailed information about gasoline costs and mileage. Greenstone and Wolfram are carrying out a randomized controlled trial to scrutinize a U.S. government program to help weather-proof the homes of low-income people.

“Part of the reason we started this project is that efficiency is one of the few areas where there’s broad agreement across the political spectrum that these are policies we should be pursuing,” Greenstone says. “And we want to be able to show what actually works and what doesn’t.”

nuclear power
News Release
MIT News

June 14, 2013
Alli Gold
MIT Joint Program on the Science and Policy of Global Change 

After the 2011 Fukushima nuclear disaster, energy experts and policymakers around the world began to reassess the future of nuclear power. Countries including Japan and Germany have since scaled back their nuclear programs or announced plans to shut them down — sparking a global debate over how nations will replace nuclear power.

Taiwan is just one country where this intense debate is unfolding. Yen-Heng Henry Chen, a Taiwan native and research scientist at MIT’s Joint Program on the Science and Policy of Global Change, decided to look at how the nation’s economy and emissions reduction strategies might be affected by future changes to Taiwanese nuclear energy policies.

“There has been little research on the interactions between non-nuclear and low-carbon policies,” Chen says. “Taiwan has a small economy and limited natural resources, making it an interesting case study for other countries looking for ways to cut carbon emissions with or without nuclear power.”

The Taiwanese government aims to cut its CO2 emissions in half (from 2000 levels) by 2050. One way it had planned to do this was through nuclear power. Taiwan currently has three nuclear power plants, with plans to bring a fourth, the Longmen Nuclear Power Station, online in 2015. This densely populated country has more than nine million residents within 50 miles of its three existing nuclear plants. Because Taiwan is similar in topography and fault lines to Japan, the prospect of the new plant — and perhaps others to come — has raised public concerns about the safety of nuclear power.

“According to a recent poll, more than 60 percent of the Taiwanese population opposed the construction of a new nuclear power plant after the Fukushima accident,” Chen says. “I wanted to know what it would mean for the Taiwanese economy and the government’s emissions reduction targets if they were to eliminate or reduce nuclear power.”

Taiwan currently imports 99 percent of its energy, which includes oil, natural gas, coal and nuclear. Because the opportunities for alternative low-carbon energies such as solar, wind and hydro are limited, Chen conducted an economy-wide analysis that explored other ways to reduce carbon emissions: nuclear power, a carbon tax, and carbon capture and storage (CCS) technology.

When implementing a low-carbon, non-nuclear policy without the availability of CCS (which is not yet cost-effective at a large scale), Chen finds that GDP would drop by about 20 percent by 2050. If CCS were to become more cost-effective and could be added to the low-carbon strategy, GDP would drop by less than 10 percent. But the least expensive way to pursue a low-carbon policy, Chen finds, would be to expand nuclear capacity in addition to adopting CCS. If nuclear capacity were tripled (compared to current levels) and the CCS option were feasible, the GDP loss by 2050 would be reduced to around five percent.
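The scenario comparison above can be restated in a few lines. The percentages are the article's rounded figures, and the scenario labels are shorthand for illustration, not Chen's terminology:

```python
# Approximate 2050 GDP losses under Taiwan's low-carbon policy scenarios,
# as reported above (rounded figures from the article, not model output).
scenarios = {
    "non-nuclear, no CCS":   0.20,  # ~20% GDP loss
    "non-nuclear, with CCS": 0.10,  # just under 10% GDP loss
    "3x nuclear, with CCS":  0.05,  # ~5% GDP loss
}

# The least expensive low-carbon pathway is the one with the smallest loss.
cheapest = min(scenarios, key=scenarios.get)
print(cheapest)  # 3x nuclear, with CCS
```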

Absent nuclear power and CCS, “Taiwan needs to convert its industrial structure into a much less energy intensive one if the country is serious about achieving a low-carbon environment,” Chen says. Taiwan’s industrial sector accounts for almost half of the country’s energy demands.

Costs could be lowered for industry and consumers if Taiwan were able to join an international emissions trading system — which Chen looks forward to exploring further in future research.

Until such an international trading system exists, “This case study can help policymakers better understand the costs of cutting CO2 emissions without nuclear energy,” Chen says, “as nuclear power becomes a less viable energy solution in Taiwan and around the world.”

ethiopia
News Release
MIT News

June 13, 2013
Alli Gold
MIT Joint Program on the Science and Policy of Global Change

If you know how much something costs, you can budget and plan ahead. With this in mind, a team of researchers from MIT, the World Bank and the International Food Policy Research Institute recently developed a country-level method of estimating the impacts of climate change and the costs of adaptation. This new method models sector-wide and economy-wide estimates to help policymakers prepare and plan for the future.

"Previous country-level research assessing climate change impacts and adaptation either focused on economy-wide estimates or sector-by-sector analysis, without looking at the bigger picture," says Kenneth Strzepek, one of the lead authors of the study and a research scientist at MIT's Joint Program on the Science and Policy of Global Change. "By looking at the interplay between sectors and within the economy, we are able to evaluate the indirect effects and interactions that are often not captured."

As a case study, the researchers apply their technique to Ethiopia — the second most populated country in Sub-Saharan Africa. They look at three key sectors: agriculture, road infrastructure and hydropower.

"These sectors were selected because of their strategic role in the country's current economic structure and its future development plans," Strzepek says.

Agriculture accounts for about 46 percent of the GDP in Ethiopia and is almost entirely rain-fed. Variability in temperature and rainfall will have major impacts on this crucial industry. The researchers found that with a temperature increase of two degrees Celsius, more intense drought and floods will cause a drop in crop production — triggering reductions in income, employment and investments.

Frequent and intense flooding will also damage Ethiopia's road infrastructure — the backbone of the country's transportation system and a needed link in the agricultural supply chain. The researchers found that flooding brought on by climate change will increase maintenance costs by as much as $14 million per year for the existing road network, which is expected to grow dramatically in the next 40 years.

The intense variability of precipitation will also greatly impact the country's hydropower and associated reservoir storage, which could provide energy, irrigation and flood mitigation. Because Ethiopia currently has little installed hydro capacity, the model showed few climate change impacts there. But the government plans to invest heavily in this sector in the coming years, so the impacts could eventually be significant.

Additionally, the researchers found that water demand would increase across sectors, creating challenges for policymakers trying to distribute this important resource effectively. For example, Ethiopia plans to expand irrigated agriculture by 30 percent by 2050. The researchers found that some of these irrigation demands will go unmet, straining other sectors that rely on water resources.

"This research makes clear the impact droughts, floods, and other effects brought on by climate change can have on major financial sectors and infrastructure," Strzepek says. "For Ethiopia, we find that one of the best defenses against climate change is investment in infrastructure for transportation, energy and agriculture. By building up these sectors, the government will be able to enhance the country's resiliency."

He continues, "In predicting the outcomes of future water, infrastructure and agriculture projects, we were able to test the effectiveness of policies. This gives decision-makers in these countries, as well as international organizations, the information they need to continue to grow, develop and plan for the future with climate change in mind."

Planning for climate change is essential, Raffaello Cervigni, a co-author of the study and lead environmental economist at the World Bank, writes in a recent blog post.

"Addressing climate change is first and foremost a development priority for Africa … If no action is taken to adapt to climate change, it threatens to dissipate the gains made by many African countries in terms of economic growth and poverty reduction over the past ten years," he writes.

But, he continues, "a harsher climate need not be an impediment for Africa's development," if we can come together to address these challenges.

The integrated approach used by the authors is now being applied to studies on the costs of adapting to climate change in Ghana and Mozambique, as well as Vietnam. The approach has also been replicated to help additional countries calculate the costs of adaptation.

Reprint 2013-7

A Slice of MIT

By Joe McGonegal

June 12, 2013 

On the slopes of Mt. Karisimbi, a 4,500-meter volcano in northwestern Rwanda, a lone MIT researcher is working this year to add new data to climate change research.

She is Katherine Potter PhD ’11, the principal investigator for the new Rwanda Climate Observatory. Working in the same area where iconic zoologist Dian Fossey studied mountain gorillas a half-century ago, Potter works just as passionately toward her goal: empowering Rwandans to take part in climate change research and getting Africa on the climate-change grid.

If Potter is successful, the observatory atop Mt. Karisimbi will join the Advanced Global Atmospheric Gases Experiment (AGAGE), a worldwide program funded in part by NASA and NOAA that captures climate data.

AGAGE began in 1978 and now includes eight observatories around the world that record air pollution and greenhouse gas emissions. It is a leading source of data for measuring progress against the benchmarks of the 1987 Montreal Protocol for ozone-depleting substances and the 1997 Kyoto Protocol for greenhouse gas emissions.

Until now, Africa has not had an observatory feeding into AGAGE’s experiment. The continent covers a fifth of the world’s land mass, so this is no small gap in the data.

Potter hopes to fix that. Working for MIT’s Center for Global Change Science, Potter is training future Rwandan scientists, technicians, and academics to collaborate in the world’s efforts to monitor climate change.

Mt. Karisimbi is a perfect place for the observatory, says Potter, who blogs about her progress. “At 4,500 meters, the air reaching the station will come from a large area, getting info from much of Africa and the surrounding oceans,” she says. “Also, it shares a border with Congo and is in the same protected area that continues into Uganda. So this is unifying the East African community in doing climate research.”

Potter’s work is the result of a conversation Rwandan President Paul Kagame began with then-MIT President Susan Hockfield in 2008. Kagame was on campus for the Compton Lecture. CGCS director Ronald Prinn ScD ’71 and geophysics professor Maria Zuber have led MIT’s efforts to develop the project since.

The project has inspired other alums, like Jonathan Goldstein ’83. “My wife Kaia Miller Goldstein and I have worked with both the Rwandan and MIT leadership,” Goldstein says. “It has been exciting to see them collaborate on this worthy project. We were thrilled to meet [Potter] while in Rwanda recently. She is a real star.”

“I think the real joy for those involved comes from the cultural collaboration, where MIT scientists can really make a difference in the world and the Rwandan people can show the world that they are rapidly advancing as a society,” says Goldstein.

MIT is one of ten universities that participate in AGAGE, a venture jointly funded by British, American, and Australian government agencies. AGAGE instruments around the world measure and report on the atmospheric levels of 33 compounds.

Potter is collaborating with the Ministry of Education in Rwanda, which is recruiting top academics and analysts from within its borders to participate. The Rwandan government is also planning to construct an €18-million cable car up Karisimbi, in the hopes that the station becomes a tourist destination, too. Potter estimates that the observatory will be complete and staffed by Rwandan scientists in the next three or four years.

Lermusiaux
Around Campus
MIT News

"When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.”

A version of this quote, originally penned by Sir Arthur Conan Doyle in “The Case-Book of Sherlock Holmes,” appears in a dog-eared copy of “Advanced Mathematical Methods for Scientists and Engineers” on a shelf in Pierre Lermusiaux’s office. The textbook, which he has kept since he was an engineering undergraduate in Belgium, introduces each chapter with a quote from the fictional sleuth — a literary prompt that pushed Lermusiaux, as a young student, to keep reading.

The quote above is particularly apt for Lermusiaux, who has devoted his research, in part, to eliminating unlikely scenarios in ocean dynamics.

Lermusiaux leads MIT’s Multidisciplinary Simulation, Estimation, and Assimilation Systems (MSEAS) group, which develops models and assimilation schemes to better predict ocean behavior for a wide range of applications — from planning the most efficient paths for underwater robots to anticipating how bioluminescent organisms will affect sonar propagation.

The group focuses, in part, on modeling coastal areas, which Lermusiaux describes as a veritable sea of complexity.

“In coastal areas, things can get more mixed up than in the open ocean,” says Lermusiaux, an associate professor in the Department of Mechanical Engineering. “You have fronts and eddies, currents and jets, and the effects of winds, the seabed and the Earth’s rotation. There is a lot of coastal ocean in the world, and it’s very dynamic."

Working hard for fun

Fluid dynamics interested Lermusiaux early on; he remembers learning of the Coriolis effect — the inertial force created by the Earth’s rotation — in a high school geography class.

“The teacher started explaining with an apple, and I still vividly remember that part, and thought it was fascinating how these forces would appear,” he recalls.

Lermusiaux grew up in Liège, Belgium, in a family of scientists. His father is a nuclear engineer, his mother a geography professor, and his sister an architect. The family often went along on his mother’s field trips, and took countless vacation detours to visit natural sites and manmade systems, including old ruins and architectural relics, following the family mantra: “It needs to be seen."

His father comes from a long line of farmers, dating back five generations — a lineage that may have rubbed off on Lermusiaux, who spent many of his weekends and holidays working at a local farm with a friend.

“We’d get up very early in the morning, and they’d do a very good breakfast of eggs and bacon, and you were almost like a son of the family,” Lermusiaux says. “We’d show up, work very hard, and we’d stink by the end of the day. But it didn’t seem like work to us — it was fun.”

When it came time to decide on a path after graduating with an undergraduate degree in mechanical engineering from Liège University, Lermusiaux recalls broaching the subject of graduate studies abroad over the dinner table. Not long after, he headed across the Atlantic to Harvard University to pursue a PhD in engineering science.

Going coastal

For his thesis, Lermusiaux worked to whittle down the uncertainty in ocean modeling. At the time, ocean data were relatively limited, and samples came with some uncertainty. As a result, approximate models initialized using that fuzzy data could lead to widely varying predictions. Lermusiaux looked for ways to characterize and predict uncertainty, and for ways to combine models with multiple data sets to reduce this uncertainty. He developed a data-assimilation method and computational schemes that produced better estimates of, and furthered understanding of, ocean dynamics. His work came at a pivotal time in ocean engineering. 

“It was the end of the Cold War, and people were looking less at the deep ocean, and moving toward the coast,” Lermusiaux says. “It was the beginning of trying to resolve the multiple scales and the motions in the ocean that matter, as realistically as possible.”

During his time at Harvard, Lermusiaux’s work occasionally took him out to sea. On one sampling expedition, he spent three weeks aboard a NATO ship near the Faroe Islands, halfway between Norway and Iceland. The region sits along the Iceland-Faroe Ridge, where warm currents from the Atlantic meet frigid waters from the Nordic seas. The interplay between the two water masses creates extremely powerful fronts that can deflect sonar signals. (The region, in fact, is a setting for the novel “The Hunt for Red October,” in which a Russian submarine evades detection by hiding in the turbulent waters.) Onboard the ship, Lermusiaux analyzed data collected during the cruise and found large-scale wave modes.

Today, he says, much of this computational engineering work can be done remotely, thanks to the Internet. Researchers can download data directly from cruise servers, and perform analyses on more powerful computers in the lab. 

Eliminating the impossible

Lermusiaux set up his own lab at the end of 2006 when, after receiving his PhD from Harvard, he accepted a faculty position at MIT. Based in the ocean science and engineering section of MIT’s Department of Mechanical Engineering, his group carries out research in mechanics, computations and control. Specifically, his group has developed and applied new methods for multiscale modeling, uncertainty quantification, Bayesian data assimilation and the guidance of autonomous vehicles. 

A specific focus has been to answer questions involving nonlinearities and multiple scales. For example, the team is modeling the dynamic marine environment in Stellwagen Bank, at the mouth of Massachusetts Bay — a rich ecological web of life forms from plankton to whales. Lermusiaux’s group uses mathematical computations to model the relationship between physical and biological processes, aiming to understand how eddies, waves and currents enhance the region’s nutrient delivery and retention. 

The group has also been looking further out to sea to study multiscale dynamics at continental shelf breaks — boundaries at which the shallow ocean floor suddenly drops off, plunging thousands of feet and giving way to much deeper waters. 

“You have fronts between the shelf water and deeper water, and that’s an important region for exchanges,” Lermusiaux explains. “However, the multiscale interactions at shelf breaks are not well understood.” 

Recently, his group has characterized the multiscale variability of internal tides at the Middle Atlantic Bight shelf break, showing how this variability can be driven by strong winds and by direct Gulf Stream interactions.

To allow such multiscale studies, Lermusiaux’s team has adapted new ideas in computational fluid dynamics. They are developing numerical models with variable resolutions in time and space, and have also created equations that predict uncertainty in large-scale ocean systems. They then developed nonlinear Bayesian data-assimilation methods that employ these uncertainty predictions. These methods can predict the likelihood of different scenarios and combine these scenarios with actual field observations in a rigorous Bayesian fashion.  
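The Bayesian reweighting idea can be illustrated with a toy scalar example. This is a generic sketch under an assumed Gaussian observation-noise model, not the MSEAS group's actual assimilation schemes:

```python
# Toy Bayesian update: an ensemble of "scenarios" (scalar forecasts here)
# is reweighted by how well each explains a new observation.
import math

forecasts = [1.0, 2.0, 3.0, 4.0]   # ensemble of predicted values
weights = [0.25] * 4               # equal prior weights
obs, obs_std = 2.5, 0.5            # a noisy field observation

# Likelihood of the observation under each scenario (Gaussian noise model)
lik = [math.exp(-0.5 * ((obs - f) / obs_std) ** 2) for f in forecasts]
posterior = [w * l for w, l in zip(weights, lik)]
total = sum(posterior)
posterior = [p / total for p in posterior]

# Posterior-mean estimate, pulled toward scenarios consistent with the data
estimate = sum(p * f for p, f in zip(posterior, forecasts))
print(round(estimate, 2))
```

Scenarios far from the observation keep almost no weight, which is the "eliminating the impossible" step in miniature.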

The researchers are also applying their models to the dynamic control and planning of swarms of autonomous underwater vehicles, or AUVs. Increasingly, these robots are used to sample and monitor the ocean for pollution, marine populations, energy applications, and security and naval operations. With his students, Lermusiaux is developing mathematical models to determine the most efficient paths for robots to take, maintaining coordination among robots along the way. For instance, if a current is likely to flow in a certain direction, a robot may want to simply ride the wave toward its destination.  

Lermusiaux’s group is also working on schemes that guide such sensing robots toward locations that provide the most useful undersea data. Similarly, the researchers have recently integrated their work into powerful new systems that can objectively rank competing ocean models, accounting for all uncertainties. 

The key to this kind of modeling, as with much of Lermusiaux’s work, is eliminating unlikely, or impossible, scenarios. Determining whether a vehicle should go left or right, for example, becomes a numerical process of elimination based on parameters such as current speed and direction — an oversimplification, of course, compared with the incredibly complex environment he models.

"We have made advances in numerical schemes, uncertainty prediction, data assimilation and inference, which all have applications in many engineering and scientific fields,” Lermusiaux says. “The smarter you are in combining information with model simulations, the better you can be.”

NOAA
Oceans at MIT

The Keeling Curve record from the NOAA-operated Mauna Loa Observatory shows that the atmospheric carbon dioxide concentration hovers around 400 ppm, a level not seen in more than 3 million years, when sea levels were as much as 80 feet higher than today. Virtually every media outlet reported the passage of this climate milestone, but we suspect there’s more to the story. Oceans at MIT’s Genevieve Wanucha interviewed Ron Prinn, Professor of Atmospheric Science in MIT’s Department of Earth, Atmospheric and Planetary Sciences. Prinn is the Director of MIT’s Center for Global Change Science (CGCS) and Co-Director of MIT’s Joint Program on the Science and Policy of Global Change (JPSPGC).

Prinn leads the Advanced Global Atmospheric Gases Experiment (AGAGE), an international project that continually measures the rates of change of the air concentrations of 50 trace gases involved in the greenhouse effect. He also works with the Integrated Global System Model, which couples economics, climate physics and chemistry, and land and ocean ecosystems, to estimate uncertainty in climate predictions and analyze proposed climate policies.

What is so significant about this 400-ppm reading?         
Credit: NOAA
This isn’t the first time a reading of 400 parts per million (ppm) of atmospheric CO2 was obtained. It was recorded at a NOAA observatory station in Barrow, Alaska, in May 2012. But the recent 400-ppm reading at Mauna Loa, Hawaii, got into the news because that station produced the famous “Keeling Curve,” the longest continuous record of CO2 in the world, going back to 1958.

‘400’ is just a round number. It’s more of a symbol than a true threshold of climate doom. The real issue is that CO2 keeps going up and up at about 2.1 ppm a year. Even though a global recession lowered emissions in most fully developed countries, China, and to a lesser extent India and Indonesia, blew right through it and continued to increase their emissions.

Has anything gone unappreciated in the news coverage of this event?

Yes. What’s not appreciated is that there are a whole lot of other greenhouse gases (GHGs) that have fundamentally changed the composition of our atmosphere since pre-industrial times: methane, nitrous oxide, chlorofluorocarbons (CFCs), and hydrofluorocarbons. The screen of your laptop was probably manufactured in Taiwan, Japan or eastern China by a process that releases nitrogen trifluoride — release of 1 ton of nitrogen trifluoride is equivalent to 16,800 tons of CO2. But there is a fix: the contaminated air in the factory could be incinerated to destroy the nitrogen trifluoride before it’s released into the environment.

Many of these other gases are increasing faster, in percentage terms, than CO2. In the Advanced Global Atmospheric Gases Experiment (AGAGE), we continuously measure over 40 of these other GHGs in real time over the globe. If you convert these other GHGs into the equivalent amounts of CO2 that would have the same effect on climate, and add them to the NOAA measurements of CO2, you find that we are actually at 478 ppm of CO2 equivalents right now. In fact, we passed 400 ppm back in about 1985. So 478, not 400, is the real number to watch. That’s the number people should be talking about when it comes to climate change.
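Converting other gases into CO2 equivalents is a weighted sum. Here is a minimal sketch, using the per-ton equivalence factors quoted in this interview for NF3 and SF6 as illustrative global-warming potentials (they are not official inventory values):

```python
# CO2-equivalent accounting: weight each gas's emissions by its
# global-warming potential (GWP). The NF3 and SF6 factors are the per-ton
# equivalences quoted in the interview; treat them as illustrative.
GWP = {"CO2": 1, "NF3": 16_800, "SF6": 22_800}

def co2_equivalent(emissions_tons):
    """Total warming effect expressed as tons of CO2."""
    return sum(GWP[gas] * tons for gas, tons in emissions_tons.items())

# 1,000 tons of CO2 plus a quarter ton of NF3:
print(co2_equivalent({"CO2": 1000, "NF3": 0.25}))  # 1000 + 4200 = 5200.0
```

The same weighted sum, applied to measured concentrations instead of emissions, is how a figure like "478 ppm of CO2 equivalents" is built up from the individual gas records.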

What has Advanced Global Atmospheric Gases Experiment (AGAGE) revealed about this greenhouse gas problem?

The non-CO2 GHGs are very powerful. One example is sulfur hexafluoride (SF6), which used to be in Nike shoes, and is now most widely used in the step-down transformers in long-distance electrical power grids. But SF6 leaks a lot, with 1 ton equivalent to 22,800 tons of CO2, and it’s increasing in our measurements. Another example is methane. We have been measuring methane for almost 30 years now, and it actually didn’t increase for almost 8 years from 1998 onwards, but we discovered in our network that it began to increase again in 2006. We published this finding in 2008, and ever since, methane has been rising at a rapid rate. Nitrous oxide, the third most important GHG, has been going up almost linearly since we started measuring it in 1978.

The worrisome thing is that almost all of these gases keep rising and, per ton, they are very powerful drivers of warming. Many of these GHGs have lifetimes of hundreds to thousands to tens of thousands of years, so they are essentially in our atmosphere forever. There is almost nothing practical we can do to vacuum these gases out again. 

Is it possible to decrease the atmospheric CO2?

One well-understood method of removing CO2 from the atmosphere is carbon sequestration, in which you capture the CO2 from biomass burnt in an electrical utility and then bury it in subsurface saline aquifers or in the deep ocean. There are people here at MIT, Rob van der Hilst and Brad Hager and others, who study just how permanent this deep burial on land is.

Carbon sequestration can also lower CO2 emissions from coal-fired power plants. It looks like the Department of Energy will reactivate a couple of these projects in Wisconsin and Texas to better understand this technology, with the goal of lowering the emissions from power plants to say 10% or less of what they were.

At the end of the day, the smart thing would be not to resort to vacuuming CO2 out of the atmosphere and putting it down deep underground. It would be better to develop new and affordable zero- or very low-emission energy technologies such as biofuels, nuclear, solar and wind.

Will switching to ‘fracked’ natural gas reduce warming? 

Hydraulic fracturing in a vertical well. Credit: EPA

We have run our Integrated Global System Model presuming that hydraulically fractured gas from shale deposits in the US and elsewhere around the world could begin to be used at large scale. We’ve looked at the question of whether converting all oil usage to fracked gas usage over the next 20-30 years would lower the rate of warming. And the answer is yes, because you get about twice as much energy per ton of CO2 emitted from burning methane as you get from coal.
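The "twice as much energy per ton of CO2" claim can be sanity-checked with back-of-envelope chemistry. The heating values and coal carbon fraction below are rough textbook-style assumptions, not figures from the model:

```python
# Back-of-envelope check: energy released per kilogram of CO2 emitted.
# Heating values and coal carbon content are rough assumptions.
MJ_PER_KG = {"methane": 50.0, "coal": 30.0}   # approx. heating values, MJ/kg fuel
KG_CO2_PER_KG_FUEL = {
    "methane": 44.0 / 16.0,        # CH4 -> CO2: one carbon atom per molecule
    "coal": (44.0 / 12.0) * 0.8,   # assume coal is ~80% carbon by mass
}

def energy_per_kg_co2(fuel):
    """Megajoules of energy obtained per kilogram of CO2 released."""
    return MJ_PER_KG[fuel] / KG_CO2_PER_KG_FUEL[fuel]

ratio = energy_per_kg_co2("methane") / energy_per_kg_co2("coal")
print(round(ratio, 1))  # roughly 1.8 under these assumptions, i.e. close to "twice"
```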

There are some serious issues about the water used to pump down and split the shale. In the fracking process, trace chemicals are added into the water to make it slippery so the water can force itself in between the layers of shale. The problem is, shale is filled with mucky stuff such as salts and heavy organics, which all ends up in the frack water and comes back up to the surface. So what do you do with that very polluted water? Then there is the concern that the water could travel horizontally and vertically through the shale layers and end up in ground water. And that’s an environmental issue that has to be addressed.

However, chemical companies are already investing in technologies that can take the frack water that’s pumped back out, literally clean out the hydrocarbons, and reuse it for fracking. So there is an answer to the frack water problem, but there must be a strong push to make sure fracking is environmentally sound.

We did find that if you increased the use of fracked gas and didn’t repair the existing natural gas pipelines, they could leak several percent of the transferred volume, because much of it is old city and intercity infrastructure. All major pipeline systems in the US and Europe are leaking now, which is a problem because the leaked methane is a much more powerful GHG per ton than CO2. So repairing or replacing old gas pipelines will be a big requirement.

Addressing all these environmental concerns will add somewhat to the cost of energy. But most who study the climate issue in detail and in depth understand that the damages that are going to result from continued warming will far exceed the cost of any policy that we put together to lower GHG emissions. Yet, as you know, the politics of climate in Washington is impossible right now because a minority of senators can block any legislation. It doesn’t look like anything will happen soon on a national emissions reduction policy. Politics trumps science on these issues. But the EPA has the power to treat CO2 as an air pollutant so maybe that’s what will happen near term.

The bottom line is if we switched from using oil and coal globally to running everything on shale gas, there probably is enough gas there. But with this alone, you would still get about a 3.5 C warming by 2100. With no policy at all, our model estimates a 5-degree or higher warming. So replacing coal and oil with fracked gas is a sensible pathway for the US to go over the next few decades, with the additional advantage of gaining more energy independence. But it won’t remove the global warming threat beyond that.

What are the implications of the 478-ppm measurement for human life?

According to the paleoclimatological ice core record, if our planet warms more than 2 C globally (4 C at the poles), we are in trouble. That’s about 6 meters or 20 feet of sea level rise. Most of the world’s valuable infrastructure and high populations are along the coasts. So, the damage and cost of sea level rise alone is potentially very high. Other risky phenomena we face are shifting rainfall patterns that may move the locations of arable farmland out of the US and into Canada. Mexico could grow drier and drier, and there’s concern in the Department of Defense about potential challenges to the security at the southern US border. Other similarly vulnerable areas around the world could face desperate large-scale migrations of people seeking to find places to grow food.

These damages are likely to exceed significantly the costs associated with an efficient and fair GHG policy such as an emission tax whose revenues are used to offset income taxes.

Jessika Trancik
News Release
MITEI

June 6, 2013
Vicki Ekstrom, MIT Energy Initiative

The cost and performance of future energy technologies will largely determine to what degree nations are able to reduce the effects of climate change. In a paper released today in Environmental Science & Technology, MIT researchers demonstrate a new approach to help engineers, policymakers and investors think ahead about the types of technologies needed to meet climate goals.

“To reach climate goals, it is important to determine aspirational performance targets for energy technologies currently in development,” says Jessika Trancik, the lead author of the study and an assistant professor of engineering systems. “These targets can guide efforts and hopefully accelerate technological improvement.”

Trancik says that existing climate change mitigation models aren’t suited to provide this information, noting, “This research fills a gap by focusing on technology performance as a mitigation lever and providing a way to compare the dynamic performance of individual energy technologies to climate goals. This provides meaningful targets for engineers in the lab, as well as policymakers looking to create low-carbon policies and investors who need to know where their money can best be spent.”

The model compares the carbon intensity and costs of technologies to emission reduction goals, and maps the position of the technologies on a cost and carbon trade-off curve to evaluate how that position changes over time.
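A minimal sketch of that kind of comparison follows. The technology numbers and the intensity target are invented for illustration; the paper's actual data, curve construction, and target-setting procedure are not reproduced here:

```python
# Place technologies on a cost vs. carbon-intensity plane and compare them
# to a carbon-intensity target. All numbers are invented for illustration.
technologies = {
    # name: (cost, $/MWh; carbon intensity, kg CO2/MWh)
    "coal":        (60, 900),
    "natural gas": (70, 450),
    "wind":        (80, 10),
    "solar PV":    (110, 40),
}

target_intensity = 100  # hypothetical 2050 target, kg CO2/MWh

# Technologies consistent with the target, and the cheapest among them.
compliant = {n: c for n, (c, i) in technologies.items() if i <= target_intensity}
cheapest = min(compliant, key=compliant.get)
print(sorted(compliant), cheapest)
```

Re-running a comparison like this with updated cost and intensity numbers for successive years is one way to track how a technology's position on the trade-off curve changes over time.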

According to Nathan E. Hultman, director of Environmental and Energy Policy Programs at the University of Maryland’s School of Public Policy, this approach “provides an interesting and useful alternate method of thinking about both the outcomes and the feasibility of a global transition to a low-carbon energy system.” Hultman, who is also a fellow at the Brookings Institution, was not associated with the study.

How do technologies measure up?

According to Trancik, the cost and carbon trade-off curve can be applied to any region and any sector over any period of time to evaluate energy technologies against climate goals. Along with her co-author, MIT master’s student Daniel Cross-Call, she models the period from 2030 to 2050 and specifically studies the U.S. and China’s electricity sectors.

The researchers find that while major demand-side improvements in energy efficiency will buy some time, the U.S. will need to transition at least 70 percent of its energy to carbon-free technologies by 2050 – even if energy demand is low and the emissions reduction target is high.

Demand-side changes buy more time in China. Efficiency, combined with less stringent emissions allocations, allows for one to two more decades of time to transition to carbon-free technologies. During this time, technologies are expected to improve.

This technology-focused perspective, Trancik says, “may help developed and developing countries move past the current impasse in climate negotiations.”

While reaching climate goals is a seemingly formidable task, Trancik says that considering changes to technology performance over time is important. When comparing historical changes in technologies to the future changes needed to meet climate targets, the results paint an optimistic picture.

“Past changes in the cost and carbon curve are comparable to the future changes required to reach carbon intensity targets,” Trancik says. “Along both the cost and carbon axes there is a technology that has changed in the past as much as, or more than, the change needed in the future to reach the carbon intensity and associated cost targets. This is good news.”

The research was partially funded by the MIT Energy Initiative.

gas power
In The News
MIT Tech Review

Last week, the new U.S. secretary of energy, Ernest Moniz, pledged to continue his predecessor’s work in making the Department of Energy a “center of innovation,” while also highlighting projects he thought deserved more attention. Near the top of his list is a renewed emphasis on carbon dioxide capture and storage (CCS), a technology that could prove vital to combating climate change, but is developing far too slowly, according to the International Energy Agency.

solar power
News Release
MITEI

Vicki Ekstrom, MIT Energy Initiative

In the past decade, the massive expansion of China’s production and export of silicon photovoltaic (PV) cells and panels has cratered the price of those items globally, creating tension between China and the United States, and, more recently, China and the European Union. In a new study (see PDF), MIT researchers explain why these tensions could harm the broader solar industry and have spiraling effects for China-U.S. trade relations.

“China and the U.S., and China and the E.U., are in the midst of a blame game as the solar industry is on the brink of collapse — and the tensions could infect technology and commercial development globally,” says John Deutch, the lead author of the study and Institute Professor at MIT. “All the players should understand that the PV industry is globally linked, and jobs and profits are available for those who manufacture and for those who innovate. Given the complex but productive relationships, nations need to find a way to better work together rather than flirt with protectionist measures.”

Over the last decade, manufacturing of PV cells and panels expanded in China, boosting supply globally. The flood of solar panels, combined with weakening subsidy-driven demand for solar energy (especially in Europe), lowered the global market price to unsustainable levels, the study shows. Between 2009 and 2012, the price of crystalline silicon panels decreased from more than $2.50 per watt to less than $1 per watt, as China supplied 30 to 50 percent of U.S. PV imports.

The result? PV manufacturers globally haven’t been able to compete, Deutch and the study’s co-author, MIT professor of political science Edward Steinfeld, explain. In response, the U.S. Department of Commerce and the International Trade Commission imposed substantial anti-dumping duties — tariffs imposed on low-priced foreign imports — on some Chinese manufacturers last November, following complaints from U.S. PV manufacturers who alleged that the Chinese were selling their products below fair market value. Around the same time, Europe issued an anti-dumping inquiry; it has also threatened to announce tariffs by June 6.

China has responded with its own allegations, also threatening to issue tariffs — this time, on the materials and technology imported to make the panels. Many of those imports come from the United States. China threatens these tariffs as its PV industry also faces trouble, according to the study: Net margins of panel suppliers in China fell to double-digit negative values in 2011 and remain there now, more than a year later. The study reports that Suntech Power Holdings — the largest PV-panel maker in the world — posted a loss of $495 million in 2012. (The company declared bankruptcy at the end of March.)

Deutch and Steinfeld explain that the two nations — China and the United States — are interdependent and form a “potentially productive global ecosystem for innovation.” When one side declines — as is happening in China, with its PV manufacturers — so will the other side, as is happening in the United States, with its technology and manufacturing tools, the study says. The researchers explain that there are opportunities for the two nations to together accelerate the worldwide deployment of solar PV for electricity generation.

“The two countries have different strengths and weaknesses,” Deutch says. “The U.S. is creating the technology and manufacturing tools and China is successfully, but not profitably, manufacturing devices based on today’s technology. If both countries look at the big picture, choose to focus on their strengths, and put aside the blame game, they have a real opportunity to boost global deployment of solar.”

Arun Majumdar, the vice president for energy at Google and former director of the Department of Energy’s Advanced Research Projects Agency-Energy, says, “Deutch and Steinfeld’s factual and data-driven analysis shows that in the interdependent global market and supply chain of the solar industry, policies of individual governments that foster and leverage their domestic strengths to openly and fairly compete in the global market are better off in the long term to reach national goals of economic growth.”

Majumdar adds, “On the other hand, policies that distort the market via undue protectionism or disproportional investments to reach national goals could backfire and produce opposite outcomes.”

The study will become part of a larger report on the “Future of Solar,” to be released by the MIT Energy Initiative at a later date.