News + Media

Recent Event

Chinese policymakers, senior academics, and more than 100 researchers, scientists and industry leaders gathered last week for the Second Annual Stakeholders Meeting of the Tsinghua-MIT China Energy and Climate Project (CECP).  At the yearly gathering, participants reflected on the state of climate policy in China and the progress of the multi-disciplinary partnership, which launched last year to develop new tools to solve China’s most challenging climate and energy policy questions.

“In light of the recent agreement between Presidents Obama and Xi to limit hydrofluorocarbons—a potent greenhouse gas—we hope that close work between the two countries continues,” said Henry Jacoby, co-director emeritus of the MIT Joint Program on the Science and Policy of Global Change, during his keynote address. “In this context, the work of the CECP becomes ever more important.”

Jointly hosted by CECP’s parent research groups—the Tsinghua University Institute for Energy, Environment, and Economy and the MIT Joint Program on the Science and Policy of Global Change—the meeting creates a platform for a diverse group of policymakers to interact with researchers and explore future paths for China’s energy and climate policy. The number of external attendees more than quadrupled from last year’s conference, indicating the high level of interest in the Tsinghua-MIT collaboration and China's energy and climate policy more broadly. Senior officials from China's National Development and Reform Commission, National Energy Administration, Ministry of Industry and Information Technology, and Ministry of Science and Technology, as well as leading Chinese academics, formed a panel of experts that responded to the findings of the joint research team. Over 150 stakeholders representing industries, governments, and academic institutions in China and abroad attended the meeting, reflecting CECP's goal of sharing project insights with a broad range of global leaders on energy and climate topics.

The meeting’s main dialogue between CECP researchers and policymakers focused on future drivers of energy use and the design of a carbon emissions trading scheme (ETS) in China, a subset of the CECP’s ongoing work. CECP researchers compared China’s current climate policy—provincial carbon intensity targets—to national emissions trading system designs that varied in terms of sector and regional coverage. CECP researchers underscored the need for broad sector and geographic coverage to enhance ETS cost effectiveness, as well as the potential to achieve equity goals through the initial allocation of emissions permits.

In the afternoon, a panel of policy advisors representing planned pilot emissions trading systems in Beijing, Guangdong, Shanghai, Tianjin, and Hubei described the design and progress toward implementation, which is expected to be complete by the end of 2013. Panel participants emphasized that pilot schemes build familiarity with emissions trading and help policymakers evaluate the feasibility of a national ETS.

The CECP’s leaders, Dr. Valerie Karplus of MIT and Professor ZHANG Xiliang of Tsinghua, explained these findings based on two models they developed over the last year: the China-Global Energy Model (C-GEM) and the China-Regional Energy Model (C-REM). While the C-REM model allowed the researchers to uncover their ETS findings, the C-GEM model provided analysis on China’s "economic transformation"— the effort to move from an energy and emissions intensive economy focused on manufacturing for export to one that is more services and technology oriented.

Prof. ZHANG highlighted the importance of the MIT-Tsinghua relationship in bringing about these results. “The regular exchange of Tsinghua students working at MIT, and MIT students and researchers visiting Tsinghua, makes for a very productive working relationship,” Prof. ZHANG said, acknowledging the support of sponsors on both the MIT and Tsinghua sides. MIT founding sponsors include the French Development Agency, Eni, ICF International (a consultancy), and Shell, while the collaboration receives support at Tsinghua from the Ministry of Science and Technology, the National Development and Reform Commission, and the National Energy Administration.

Alongside government representatives, a number of senior academics from China's top universities in a variety of disciplines offered their input on the CECP's ongoing research efforts. The experts identified key ramifications from the results and called attention to future topics of interest.

“An important goal of this meeting is to bring the key stakeholders together in the same room,” said Karplus. “This helps to foster a shared awareness of the wide range of views on policy options that reflect the diverse circumstances facing China’s localities and industries.”

 

In The News
Energywire

By: Joel Kirkland and Peter Behr

June 26, 2013

President Obama's plan to use a mixture of mandates and flexible regulation to cut greenhouse gas emissions is being viewed by energy industry experts through an age-old axiom: The devil is in the details.

The plan appears to be a shot in the arm for natural gas, as Obama's proposed regulation of carbon dioxide emissions from existing coal-fired power plants would provide a boost to cleaner-burning gas generators.

The White House's second-term climate agenda faces daunting head winds. Opposition from Republicans and resistance from coal-state Democrats, and the nuts and bolts of crafting a policy that would mark a significant shift for the nation's electric power industry, are immediate hurdles. Court challenges could create a protracted process for regulating carbon, as they have for U.S. EPA regulations of toxic air pollutants pushed through by the Obama administration.

But the daily stir of coal, natural gas and electricity markets in the United States also matters. With details of the new carbon policy still to come, unpredictable market prices will continue to be an underlying driver of decisions by electricity producers about where to invest and what energy sources to dispatch. Abundant coal will battle it out with growing shale gas production, several experts interviewed by EnergyWire predicted.

"There is no disagreement that the environmental regulations, many of which were proposed during the Bush administration, are going to bind and cause several -- tens of gigawatts -- of coal plants to be uneconomic," said Jay Apt, director of the Electricity Industry Center at Carnegie Mellon University. "If gas prices stay reasonable, then people will be buying gas plants."

Obama rolled out his climate plan in the blistering summer heat. Yesterday's speech on a Georgetown University quad rested on the premise that rising concentrations of heat-trapping gases in the atmosphere are the result of industrial emissions and that unregulated U.S. power sector emissions are contributing too much to rising temperatures.

The White House directive for EPA to begin drawing up a proposal to regulate existing coal-fired power plants had already been set in motion. In 2007, the Supreme Court ruled that EPA could not sidestep its authority to regulate emissions tied to climate change. The agency later issued an "endangerment finding" that created a legal foundation for regulating carbon. 

"We limit the amount of toxic chemicals like mercury and sulfur and arsenic in our air and water, but power plants can still dump unlimited amounts of carbon pollution into the air for free," Obama said in the speech. "That's not right, that's not safe, and it needs to stop."

Carbon economics

How you get there might be left up to the most unpredictable factor of them all: the economy.

"If you want to stabilize the concentration of carbon dioxide in the atmosphere, you need a substantial reduction of emissions," Apt said.

"Meeting greenhouse gas targets in a flat-growth economy, where the industrial use of electric power is the same now as it was in the early 1990s, is very different than a scenario in which you blithely say growth is going to be 3 percent a year," he said.

A range of factors affect the immediate future of electric power generation, said Metin Celebi, a principal with the Brattle Group. "The most important one is the gas price relative to the coal price," Celebi said.

The future direction of that price is anyone's guess, given how many questions remain about the pace of shale gas production.

Of the nation's approximately 300 GW of coal-burning power generation capacity, nearly 40 GW was targeted for retirement by 2016, according to a Brattle analysis.

More "lenient" regulatory controls, along the lines expected from the Obama administration, could cause that total to rise to nearly 60 GW, the Brattle report says. A very strict policy could raise that to 77 GW of coal plant capacity retirements.

Lower gas prices plus strict regulations could cause almost half of the U.S. coal fleet to retire, a scenario that Celebi and his colleagues concluded in an October 2012 report would likely be untenable for electricity producers.

This shift is occurring for both short-term price and longer-range policy reasons that are not easy to separate. Most energy companies expect that at some point, an explicit or implicit price will be placed on power plant carbon emissions, and that particularly burdens coal plants, whose carbon footprint is twice that of efficient gas generators, said Katherine Spector, executive director of commodities strategy for CIBC World Markets Corp.

That expectation affects companies' decisions about retiring or retaining older, inefficient coal plants. "It's actually a combination of the two" -- price and policy -- "and the policy environment could significantly accelerate the trend," Spector said.

The drop in natural gas prices over the past two years has tilted production in gas's favor, particularly in competitive power markets where the two kinds of power plants seek low-bid opportunities to run hour by hour.

Until shale gas production flattened gas prices, coal held a solid lead. In April 2011, coal-fired plants accounted for 41 percent of electric power output, compared to 23 percent for natural gas. A year later, the two fuels' shares were almost identical. Then gas prices moved up again from less than $2 per thousand cubic feet to $4 recently, and coal made a comeback. Its share of electricity production was 38 percent this April compared to 26 percent for gas.

But an expectation of relatively cheap gas also affects decisions on the future of coal plants. "It is a different world in gas prices than we had three or four years ago," Celebi said. "Gas price projections have come down substantially, and that's something you can put more weight on."

The Energy Information Administration, the statistical arm of the Energy Department, noted in its 2013 annual energy outlook that "the interaction of fuel prices and environmental rules is a key factor in coal plant retirements."

For all the price and production scenarios EIA considered, less than 15 GW of new coal-fired capacity would be added between 2012 and 2040. "For new builds, natural gas and renewables generally are more competitive than coal, and concerns surrounding potential future GHG [greenhouse gas] legislation also dampen interest in new coal-fired capacity," EIA said.

Shutting one-third of coal

A 2012 paper by a team led by Massachusetts Institute of Technology researcher Henry Jacoby said the increased gas supply boosts the power industry's flexibility to meet baseload electricity demand if expectations about nuclear power don't pan out or coal retirements speed up.

Under an aggressive policy to slash carbon that requires a 50 percent emissions reduction below 2005 levels by 2050, there would have to be "substantial changes in energy technology."

If the development of shale gas became too expensive, gas use would "grow slightly for a few decades." Toward the end of that period, however, "it would be priced out of this use because of the combination of rising producer prices and the emissions penalty."

Renewable energy would grow to 29 percent of power demand, and coal would keep a substantial position in the power pie. 

Fact sheets issued by the administration yesterday left crucial questions about the policy plans unanswered, noted ClearView Energy Partners, "especially the threshold levels of emissions that will govern new and existing [generation] units."

On a back-of-the-envelope assessment, ClearView said that if the administration adopts a formula proposed by the Natural Resources Defense Council calling for a limit of 1,500 pounds of CO2 equivalent per megawatt-hour of electricity production, it could add another 70 GW of coal plant retirements by 2020. That's on top of the 40 GW of expected retirements tied to current EPA rules on mercury emissions and air toxics, the ClearView analysis said.

"In the aggregate, this adds up to a shutdown of roughly one-third of U.S. coal-fired generating capacity within the space of a decade," ClearView said. "Program design matters, too. The inclusion of offsetting emissions reduction mechanisms could keep more coal online."
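ClearView's back-of-the-envelope total can be reproduced from the round figures quoted in this article. The sketch below is illustrative only, using the approximate numbers cited above, and is not part of the ClearView analysis itself:

```python
# Rough check of ClearView's "one-third of the coal fleet" estimate,
# using the approximate figures quoted in the article.
coal_fleet_gw = 300            # approximate U.S. coal-fired capacity
mats_retirements_gw = 40       # expected under current EPA mercury/air-toxics rules
nrdc_rule_retirements_gw = 70  # additional, under a 1,500 lb CO2e/MWh limit

total_retirements_gw = mats_retirements_gw + nrdc_rule_retirements_gw
share_retired = total_retirements_gw / coal_fleet_gw

print(f"{total_retirements_gw} GW retired, about {share_retired:.0%} of the fleet")
# → 110 GW retired, about 37% of the fleet
```

At roughly 37 percent, the combined retirements land close to the "one-third of U.S. coal-fired generating capacity" ClearView describes.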

Republicans in Congress have accused the Obama administration of threatening grid reliability through its pressures on coal-fired generation.

Celebi said that remains a big question. "It depends on where and when those plants retire," he said. "If it's a long period of time, the markets will have a chance to respond by cutting consumption or adding other resources. 

"If it's too much, too quick, that will not be feasible," he said. 

State foot-dragging

Politics soured the policy debate starting in 2009, electricity demand declined as the economy slumped, and natural gas prices fell to record lows in 2012.

Meanwhile, the technology-driven onshore drilling boom has turned up a "bridge fuel" to cleaner forms of electricity. And the White House has been openly supporting gas's role in combating climate change and spurring economic growth, despite concerns about emissions of methane -- a potent greenhouse gas -- from upstream gas production.

"Sometimes there are disputes about natural gas," Obama said yesterday. "But let me say this: We should strengthen our position as the top natural gas producer because, in the medium term at least, it not only can provide safe, cheap power, but it can also help reduce our carbon emissions."

For electric utilities, the latest White House climate plan comes nearly four years after a bruising period of political wrangling over carbon cap-and-trade legislation. In 2009, the Edison Electric Institute was mired in the details of a bill that would distribute emissions credits for power generators to buy and sell under a carbon pollution cap. EEI's Tom Kuhn, president of the trade group of investor-owned utilities, had put together a fragile coalition of companies that could support the approach to ratcheting down emissions.

The premise behind the coalition-building had been that it's better to have a "market-based" program shaped by Congress than to leave it to top-down EPA regulations under the existing Clean Air Act.

The House passed the cap-and-trade bill by a slim margin in the summer of 2009, a signature achievement for House Democrats, but one that rested on compromises too politically hot for the Senate. The divisive debate revved up an opposition campaign targeting the science and politics of climate change. Republican leaders used the defeated legislation as a cudgel in the 2010 elections.

Since then, EPA has continued to tighten rules around conventional pollutants. New plants will have to comply with Mercury and Air Toxics Standards (MATS) rules. Other regulations addressing water intake and cooling water discharge are also shaping utility industry plans. Conventional coal gradually is being forced out of power portfolios.

Also on the table is an EPA draft rule that would cap carbon emissions at 1,000 pounds per megawatt-hour of generation for newly built power plants. The standard, which encourages fuel-switching to gas, would put the kibosh on new coal-fired power plants.

In a prepared statement after the president's speech yesterday, EEI's Kuhn urged EPA to put in place measures that "contain achievable compliance limits and deadlines" and "are consistent with the industry's ongoing investments to transition to a cleaner generating fleet and enhanced electric grid." 

"It is also critical that fuel diversity and support for clean energy technologies be maintained, not hindered," Kuhn added.

The slowdown in electricity demand in the United States has helped enable efficiency technology and power plant retrofits to control pollution. But analysts and executives from powerful utilities like Georgia-based Southern Co. and Ohio-based American Electric Power Co. Inc. have said carbon limits pose the biggest risk to their coal fleet. 

State utility commissions responsible for regulating power plants could slow the shift to low-carbon standards, analysts said. 

"There will be litigation and foot-dragging on the part of some states in developing implementation plans," said Adele Morris, an energy economist at the Brookings Institution. "I think EPA is in the early stages of what they would even propose."

E2e
In The News
Bloomberg Businessweek

By: Elizabeth Rowe

June 25, 2013

We all know that air conditioning eats up an enormous amount of energy. We also know that installing ceiling fans would allow us to use the air conditioner a lot less. And we all know the savings over time would pay for the ceiling fan. So why aren’t more people buying ceiling fans?

That question, and many more like it, is at the center of a research project launched by the Haas School of Business at the University of California at Berkeley and MIT’s Center for Energy & Environmental Policy Research (CEEPR). The initiative, known as the E2e Project, will work to understand cost-effective ways to reduce energy use and the obstacles that sometimes get in the way.

Drawing on the skills of both engineers and economists from MIT and Berkeley, the project derives its name from its mission: finding a smart way to go from using a larger amount of energy, or “E,” to a smaller amount of energy, or “e.”

Much of the impetus for this project comes from the McKinsey Curve, a cost curve that asserts that there are “negative cost” energy efficiency investments that essentially pay for themselves. “There’s a fair bit of evidence out there that suggests there’s a lot of low-hanging fruit in terms of energy savings, but much of that evidence is based on engineering models,” says E2e co-director Christopher Knittel, a professor at MIT’s Sloan School of Management and CEEPR co-director. “Much of the engineering research ignores behavioral changes that might come in response to those investments, and those behavioral changes can manifest themselves in many ways.”

For example, Knittel says, households might turn their thermostat down in the summer or up in the winter if heating and cooling homes become more energy-efficient. Such behaviors, which reduce the benefits of energy efficiency, aren’t accounted for in engineering models, he says.

One study undertaken by E2e will determine how much energy the federal Weatherization Assistance Program saves. The project examines low-income households in Michigan that received free efficiency upgrades, such as insulation and weatherproofing, and audits their energy use over time to find out why actual efficiency gains are less than expected. Final results are expected later this year.

According to Knittel, E2e has three main objectives. One is to determine whether these “negative-cost” investments truly exist. The second is to understand which of these investments has the greatest return on investment. And third, E2e will try to understand why consumers and companies aren’t making these investments if they truly have a negative cost.
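The "negative cost" notion at the center of these questions can be sketched as a simple payback calculation. The dollar figures below are purely illustrative assumptions, not data from the E2e Project:

```python
# Simple-payback sketch for an efficiency investment such as a ceiling fan.
# All numbers here are hypothetical, chosen only to illustrate the idea of
# a "negative cost" investment whose savings repay its purchase price.
def simple_payback_years(upfront_cost, annual_savings):
    """Years until cumulative energy savings cover the upfront cost."""
    if annual_savings <= 0:
        return float("inf")  # never pays back
    return upfront_cost / annual_savings

fan_cost = 150.0   # hypothetical installed cost, in dollars
ac_savings = 60.0  # hypothetical reduction in annual air-conditioning cost

years = simple_payback_years(fan_cost, ac_savings)
print(f"Payback in {years:.1f} years")  # → Payback in 2.5 years
```

If the fan lasts longer than the payback period, the investment has "negative cost" in the McKinsey Curve sense; the rebound behaviors Knittel describes would shrink `ac_savings` and stretch that period out.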

E2e co-director Catherine Wolfram, an associate professor at Haas and co-director of the Energy Institute, says E2e has a broader goal, too: “At the heart, I think we’re interested in finding the lowest-cost way to mitigate climate change.” She adds, “In the short term we hope to deliver to policymakers some really good information about where human behavior might influence energy efficiency technology and policy.”

National Academy
In The News
National Academy of Sciences

John Reilly, co-director of the Joint Program on Global Change, served on the committee responsible for a new National Research Council (NRC) report on the “Effects of U.S. Tax Policy on Greenhouse Gas Emissions.”

The report found that while tax policies can make a substantial contribution to meeting the nation's climate change objectives, the current approaches do not. In fact, current federal tax provisions have a minimal net effect on greenhouse gas emissions. While the report does not make any recommendations about specific changes to the tax code, it says that policies that target emissions directly, such as a carbon tax or cap-and-trade system, would be the most effective and efficient ways of reducing greenhouse gases.

Reilly, with colleague Sebastian Rausch, authored a report last summer that demonstrated the benefits of a tax on carbon emissions that could be part of a broader tax reform package.

“Congress will face many difficult tradeoffs in stimulating the economy and job growth while reducing the deficit,” said Reilly at the time the report was released. “But with the carbon tax there are virtually no serious tradeoffs. Our analysis shows the overall economy improves, taxes are lower and pollution emissions are reduced.”

The study — “Carbon Tax Revenue and the Budget Deficit: A Win-Win-Win Solution?” — calculated the impact a carbon tax starting at $20 per ton would have using a national economic model that details energy, taxes and household incomes. Reilly and Rausch, now at ETH Zurich, found that the tax would raise $1.5 trillion in revenue. That money could then be used to reduce personal or corporate income taxes, extend the payroll tax cut that expires this year, maintain spending on social programs—or some combination of these options—while reducing the deficit.
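As a rough order-of-magnitude check on the revenue figure, a flat $20-per-ton tax can be scaled against annual emissions. The emissions level below is an assumption (roughly the scale of U.S. CO2 emissions at the time), not a number from the study, whose model lets emissions and prices respond to the tax over time:

```python
# Order-of-magnitude check on the carbon tax revenue figure.
# The emissions level is an assumption, not taken from the study itself.
tax_per_ton = 20.0          # dollars per ton of CO2, the study's starting rate
annual_emissions_gt = 5.5   # billion tons of CO2 per year (assumed, held flat)
target_revenue_bn = 1500.0  # the study's $1.5 trillion, in billions

annual_revenue_bn = tax_per_ton * annual_emissions_gt
years_needed = target_revenue_bn / annual_revenue_bn

print(f"~${annual_revenue_bn:.0f} billion per year; "
      f"$1.5 trillion accumulates in about {years_needed:.0f} years")
```

Under these flat assumptions the tax raises on the order of $110 billion a year, so $1.5 trillion corresponds to roughly a decade and a half of revenue; the study's modeled horizon differs because rates and emissions evolve.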

The NRC study came after Congress requested that a committee evaluate the most important tax provisions that affect carbon dioxide and other greenhouse gas emissions and estimate the magnitude of the effects.  The report considers both energy-related provisions — such as transportation fuel taxes, oil and gas depletion allowances, subsidies for ethanol, and tax credits for renewable energy — as well as broad-based provisions that may have indirect effects on emissions.

Reilly notes that his “win-win-win” study on carbon taxes showed that by “shifting the market through a tax on emissions rather than through tax credits for renewable sources, the nation would be raising revenue rather than spending it.”

Parts of the NRC’s media release were adapted for use in this news story.


Recent Event
Cargill

Creating a more food-secure world through adaptation and resilience

MIT Global Change Forum - Boston, Massachusetts

Greg Page, Cargill Chairman and Chief Executive Officer
June 4, 2013

(As prepared remarks)

I am very happy to be here this evening and glad that John Reilly extended the invitation. The theme for this year’s forum is “Water, Food and Energy in a Changing World.” In the past, I know these forums have focused on constrained resources like water and energy, so tonight we will talk more about food – which is the great natural combination of water and energy. 

My remarks this evening hopefully will be a good kickoff for two interesting days talking about the intersection of water, food and energy, which are more closely linked than ever before. 

There could be 9 billion people on this earth in 40 years…and we will feed them. How can we do it given the myriad factors – including climate – that are part of the food security puzzle? It's a challenge that will take our collective wisdom to solve. And it will require adaptive behaviors and resilience.

About Cargill

To give context for my remarks tonight, let me share just a few comments about Cargill.

Cargill is a company that began in the Midwest almost 150 years ago and has grown over time to where two-thirds of our employees are outside the U.S. In short, Cargill has globalized along with the world’s GDP.

Cargill operates in four key segments. The first is the business of taking food and crops from times and places of surplus, to times and places of deficit. That traditional role of Cargill in grains and primary oilseeds represents about 25 percent of the company.

The next segment is providing farmers with a variety of services and access to markets.

Third is our food and meat businesses. Our businesses here include cocoa and chocolate, malt, corn milling, flour, salad dressings, vegetable oil, and a fairly significant meat business.

Finally, Cargill has a risk management business. We trade ocean freight, coal, electricity, natural gas, petroleum, iron ore and basic metals. Clearly the prices of these commodities, particularly freight, petroleum and energy, have a dramatic impact on agriculture.


Cargill and MIT

Cargill has been a sponsor of the MIT Joint Program on the Science and Policy of Global Change since 2008. I know this forum has gained an international reputation for serious and frank discussions of global issues.

MIT’s program is one of several university programs Cargill funds that are helping us better understand climate science, impacts on crop yields, sustainability and implications for food security. (Others include Stanford’s Center for Food Security and the Environment, and the University of Minnesota’s Global Landscapes Initiative.) Cargill also has engaged with various think tanks to build our understanding of climate change issues, including Resources for the Future. We also appreciate the work of IFPRI – the International Food Policy Research Institute – which is on the agenda tomorrow, and of others in the nonprofit, IGO and academic space who are devoting their time and talent to the challenge of feeding the world.

Some of Cargill's most important philanthropic partnerships are with nonprofits like CARE, Feeding America, The Nature Conservancy, TechnoServe and others who are trying to make the world more food secure by either raising incomes, expanding access to food or ensuring that food production is done in an environmentally responsible way.

The complexity of food security

Of all the challenges facing our world today, none is more immediate than the need to provide sufficient nutrition for all. Food security involves interdependent parts, and having all those parts working together is what is complicated.

The globe’s population is not only increasing, it is becoming more urban and more affluent.  Our ability to meet that challenge is affected by these factors:

  • Diets are changing as income levels rise.
  • Biofuels have become a significant consumer of traditional crops.
  • Public investment in agricultural research has been declining.
  • Government policies that inhibit trade or limit productivity are affecting food availability and price.
  • And localized supply shocks and production shortfalls continue to occur – although in 2012 we had adequate production – even with the U.S. drought – we just didn’t share well.

Some people question whether we can grow enough food, especially in a world that needs to adapt to changes in climate.


Cargill is optimistic

At Cargill, we are optimistic. We believe that the world can feed itself and that we can harness the power of photosynthesis to produce all the nutrition needed for an increasingly prosperous world.

Our optimism is rooted in the ingenuity of the world’s farmers. They are natural innovators, adapting to changes in the environment and technology – as proven by the doubling of the grains, rice and oilseeds they have produced since 1975 without a significant increase in acreage, much of it coming from double cropping.

I will try to show how the stakeholders in world food production are exhibiting the adaptive behaviors that underpin resilience and provide us a more food-secure world.

Can the food systems we rely on adapt?

If we are in a period of accelerated climate change, the question is whether the food systems upon which we rely can adapt.

Sometimes when we hear the word resilience applied to agriculture, we think of a hard-working and stoic farmer valiantly saving his crop from pests, drought and frost. What I’m trying to convey is a much broader notion of “systemic resilience,” where all major stakeholders in the global food system are poised, able and willing to build solutions to broad challenges in an effective and, most important, complementary way – a global system that is sufficiently flexible to produce enough food despite localized disruptions.

The resilience of farmers

Let’s start where it all begins. On the farm.

While many people are working on climate change mitigation strategies…the farmer will be busy doing that and doing what he or she has always done: adapting! Clearly we need both adaptation and mitigation…it is not an either/or choice.

Farmers are the consummate optimizers. Every year, they look in their field, and look at what has been dealt to them. And -- in the best-run countries -- at the last minute, they make a decision about how to optimize their profitability – to grow what the market is signaling to the best of their ability. They look at input costs, forecasts, relative output prices, soil moisture. And then they plant.

There aren’t many of us that can come to work and turn on a dime as skillfully and naturally as farmers.  And while many of my examples today relate to modern Western agriculture, we also see resiliency in smallholder farmers. 

The power of price

To farmers, rising commodity prices are a potent fertilizer, motivating them to produce more when the market calls for it. In late March, the USDA predicted U.S. farmers would plant the most corn since 1936 – about 97 million acres – and that 77 million acres would be planted with soybeans. That acreage is predicted to deliver huge harvests. The predictions come with a caveat, of course: they are predicated on a return to reasonable weather during the growing season.

If we take good prices to farmers, incredible progress can be made on food security. In developing countries it is an issue of the economic capacity of non-farmers to put enough price into their agricultural systems to create sustainable agriculture.

Growth in non-farm income is a precondition for agricultural development in emerging economies.  Climate, water, seed, technology and agronomy are all important. But the fundamental ingredient of sustainable agriculture is an adequate price to reward the farmer for her efforts.  Without the signaling power of price, there will be no change in farmer behavior.

Better technology at an accelerated rate

Because of recent high prices, farmers have had a run of prosperity and are gobbling up technology like never before. Better technology is coming into agriculture at an accelerated rate. And farmers are willing to invest. Cargill has a role in this, facilitating the transmission of price signals and bringing farmers technology and risk management options so they can make decisions that maximize their profitability.

Here is an example. Some of you may know that at least in the middle of the country, we had a very late-arriving spring, and planting was delayed. But in another example of resilience, in just one week, 52% of Minnesota’s corn crop was planted. That is the fastest one-week corn planting on record.

Cargill. View from tractor seat. Planting a field.
A farmer plants his crops at night, seen from the tractor seat: the monitor, the light bar on the hood, almost no visibility in the dark – yet the planter is still operating, because the GPS is steering.

Adaptation at work 

Here you can see a photo from one of our farmer customers who is planting at night. From the tractor seat you can see the monitor and the light bar on the hood, with almost no visibility in the dark – yet the planter is still operating, because the GPS is steering. About 75% of our customers can now plant 100% of their corn and soybeans in seven days or less.

And one of our customers in Ohio planted 6,000 acres of corn and soybeans in four and a half days. That is what adaptation, profitability and reinvestment have done on the farm.

Key driver of production: yield increase 

Over a long period of history, the main contributor to increased food production has been yield gain through genetic improvement and fertilizer use. Acreage was relatively stable from 1975 to the early 2000s; only recently have we seen harvested acreage increase.

Farmers have met the challenge of increasing demand for food. But the environmental price of our practices was in some cases too high, and the debt to Mother Earth is now being repaid through a host of remediation efforts: agricultural setbacks and buffer strips to prevent runoff; better and reduced tillage practices; restoration of wetlands; and bio-digesters that recover energy from dairy waste – to name just a few.

Cargill. Satellite imagery for determining yield potential and crop inputs.
A satellite image for precision agriculture. Cargill's Next-Field system uses satellite images and soil sampling to determine yield potential and crop inputs.

Precision agriculture

One of the ways farmers are meeting the challenge is through optimization of inputs. Precision Agriculture is the integration of all sorts of optimization tools.

Basic precision agriculture uses satellite images and soil sampling to come up with an average yield potential for a field, and crop inputs like fertilizer are applied based on average yield goals. Cargill’s Next-Field system goes several steps further and develops 2.5-acre yield zones within a field to more precisely apply inputs where they make the most difference, based on that area’s individual yield environment – and in the process reduces waste and environmental impact. We should be impressed that the free market, without any intervention, has incented the conservation of valuable resources and improved sustainability.
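The zone logic can be illustrated with a minimal sketch. This is not Cargill's actual Next-Field model; the zone yield goals and the nitrogen-per-bushel factor below are hypothetical, chosen only to show why matching inputs to each zone's yield environment wastes less than applying a single field-average rate:

```python
# Sketch of zone-based variable-rate application (hypothetical numbers,
# not Cargill's actual Next-Field model). Each zone gets fertilizer
# matched to its own yield goal rather than the whole-field average.

N_PER_BUSHEL = 1.1  # assumed lbs of nitrogen per bushel of corn yield goal

def uniform_rate(zones):
    """Apply the field-average rate everywhere (basic precision ag)."""
    avg_goal = sum(z["yield_goal"] for z in zones) / len(zones)
    return [avg_goal * N_PER_BUSHEL for _ in zones]

def variable_rate(zones):
    """Apply a rate matched to each zone's own yield potential."""
    return [z["yield_goal"] * N_PER_BUSHEL for z in zones]

# Three hypothetical 2.5-acre zones with different yield goals (bu/acre).
zones = [
    {"id": "A", "yield_goal": 220},  # high-potential ground
    {"id": "B", "yield_goal": 180},
    {"id": "C", "yield_goal": 140},  # lower-potential ground
]

uniform = uniform_rate(zones)
variable = variable_rate(zones)
# On the low-potential zone, the uniform rate over-applies nitrogen:
excess_on_C = uniform[2] - variable[2]
print(f"Uniform rate over-applies {excess_on_C:.0f} lbs N/acre on zone C")
```

The saving comes from the low-potential zones: nitrogen that a uniform rate would have applied there (and lost to runoff) is simply never spread.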

Another example of resilient behavior in farmers was featured in a May 20 New York Times story about the Ogallala Aquifer. The aquifer is under depletion stress in High Plains states as a result of intensive farming and drought. The story was loaded with examples of how farmers are adapting to this changing environment: switching to raising dairy heifers, or switching to less water-thirsty crops such as sorghum. Or deciding to rely on rain alone and the lower resulting corn yields.

Farmers also take steps to protect moisture and soil conditions in their fields through conservation tillage and tractors with wide tracks that leave a light footprint. They “tiptoe” across fields so they don’t cause soil compaction. Our industrial innovators captured this market opportunity, proving their resilience.

In addition to farmers, livestock producers are optimizers, too. For example, there has been a huge surge in the addition of enzymes to animal feed, born largely out of rising ingredient prices partially connected to the ethanol boom, which signaled a need to improve feed conversion into meat and milk.

The resilience of governments and policymakers

Let’s move off the farm and into the halls of government. How are governments and policymakers showing resilience?

Fortunately, when it comes to behaviors that distort markets and disincentivize farmers, governments today are for the most part showing more restraint. Compared with the embargoes of the 2008-2009 period, there have been fewer market- and price-distorting behaviors of late, such as artificially suppressing prices, hoarding supplies or banning imports or exports.

But there are exceptions. We have seen this clearly in the Indonesian beef market, where steps were taken to block live cattle and boxed beef imports in an effort to spur local production. This has resulted in dramatically higher prices for Indonesian consumers and lower supplies on grocery store shelves.

For those of us who believe the economist David Ricardo, we know that self-sufficiency is not the answer. The world will always raise the most food the most economically and in the most environmentally responsible way when farmers plant the right crops for their local climate and soils using the right technology, then trade with others.  If every government set a goal of food self-sufficiency, the world would have much less food.

Resilience in the world’s largest agricultural economy

Another example of resilience is what has been happening in China. Policymakers in the world’s largest agricultural economy have shown their ability to adapt and change behaviors.

China has helped the world in aggregate produce more food through its decision to honor comparative advantage and import soybeans. When it focuses on areas where it has an advantage – using its scarce land to produce corn, wheat and rice, which yield relatively well in China, and importing soybeans and vegetable oils, which yield relatively poorly there – the world in total raises more food.

The challenge for China now is that it built its model for agriculture on the assumption of inexpensive and widely available labor, and a multi-cropping environment. As wages rise, urbanization continues, and agricultural land reform evolves, China again will be tested in terms of its resiliency.

Africa’s critical role in feeding the world

Africa is another example where government action and policymaking is so critical to our ability to feed the world.

In short, Africa has the soil and the rainfall -- but not the policy, infrastructure and rule of law -- that are necessary, along with higher non-farm income, for increasing food production. But we see positive changes in the works.

Much of Africa has seen some of the lowest productivity gains of the last 40 years. But there are countries on the continent that have shown resilience and a commitment to change the face of agriculture.

As just one example, Nigeria is working hard to transform its agricultural sector – in its own words, “treating agriculture as a business and not a development program,” and treating farmers as business people and not aid recipients. The government is interested in attracting private-sector investment and in creating the right conditions for both smallholder and large-scale farmers to succeed and collaborate.

Ensuring production through science and innovation

How are policymakers helping ensure the availability of food through their support of science and innovation? I believe the acceptance of science is the foundation to being resilient.

Clearly the world has benefited from the application of food technology and particularly, the appropriate and well-regulated use of genetic engineering to create foodstuffs that are cheaper to produce and require less water, chemicals and tillage.

But the use of that technology is under debate in the United States. In American agriculture, resistance to genetically modified (GM) products was long seen as a European issue, but now it is an American one. A ballot initiative in California calling for front-of-package labeling of any GM product was proposed and then defeated, and the campaign made clear that public understanding of GM is limited. We need to reach out to both governments and consumers to better explain the benefits of this technology. We need to gain society’s permission to use sound, proven and well-regulated science in the production of food.

The resilience of consumers

While producers and policymakers are changing their behaviors, so are consumers.

I would argue we have seen the resilience of the world’s population as it faced higher food prices over the past six years. As food prices spiked, the conventional wisdom was that this would be a big challenge for developing countries, and that they would have no defense against the rising cost of food.

Instead, GDP growth rates in developing economies have continued to climb over that period. Part of the reason is that as much as 70% to 80% of the population is involved in farming, and higher prices have created an income opportunity for farmers. In fact, they have been thriving, because they are producing and selling into this better price environment.

Let me share an astonishing fact with you... Based on our tracking of global incomes, we have seen real, inflation-adjusted GDP of the least affluent 70% of the world's population more than double in the last 10 years.  To me that's incredible resilience during a period so widely lamented.

You may be surprised to hear that we collectively return less than 2% of global GDP to farmers for the calories they bring us in basic foodstuffs.  We can afford to be thoughtful in how we compensate them.

Food is emotional

We also need to acknowledge that food is personal.

In developed countries, people increasingly want to know the "story" behind their food. Just look at the bookstore shelves to see how much we as consumers contemplate our diets. Food is emotional, but we need to address it with hard science in a resource-constrained world. We shouldn’t return to medieval agriculture.  We need a science-driven agenda, not an emotion driven one.

Global production and consumption

So what kind of results does this resiliency produce?

This chart shows that the year-to-year variability of the world’s total annual crop production around its trend line is no greater than it was 35 years ago. So despite headlines suggesting the world has become a far more desperate and volatile place, the actual data on deviations from the trend line in global agricultural production are no different from the 1970s. We don’t deny the possibility, or even the likelihood, of looming challenges ahead related to agriculture, but we haven’t seen the impact of climate change in our data so far. Climate change is a risk we cannot dismiss frivolously. We need the scientific community -- and many of you in this room -- to better define the risk, based on science.
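The measure behind such a chart can be sketched in a few lines: fit a linear trend to annual production and look at percent deviations from it. The production series below is synthetic, purely to illustrate the computation, not actual crop data:

```python
# Sketch of a trend-deviation measure: fit an ordinary least-squares trend
# to an annual series, then express each year as a percent deviation from
# the fitted trend. The series here is synthetic (growth plus small shocks).

def linear_trend(years, values):
    """Ordinary least-squares fit: returns (slope, intercept)."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    den = sum((x - mean_x) ** 2 for x in years)
    slope = num / den
    return slope, mean_y - slope * mean_x

def pct_deviations(years, values):
    """Percent deviation of each observation from the fitted trend line."""
    slope, intercept = linear_trend(years, values)
    out = []
    for x, y in zip(years, values):
        fitted = slope * x + intercept
        out.append(100 * (y - fitted) / fitted)
    return out

# Synthetic production series: steady growth plus small weather shocks.
years = list(range(1975, 1985))
shocks = [1, -2, 0, 3, -1, 2, -3, 1, 0, -1]
production = [100 + 2 * i + s for i, s in enumerate(shocks)]

devs = pct_deviations(years, production)
print(max(abs(d) for d in devs))  # largest deviation from trend, in percent
```

Comparing the spread of these percent deviations across decades is one simple way to test whether production has actually become more volatile relative to trend.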

The changing Canadian Prairie provinces

In North America, we are seeing changes not just in how food is produced, but where it is produced.

I grew up in North Dakota, where the big question up for debate in the spring was: would you plant your wheat on May 20 or May 28? Now the question is much more complicated. Your options are not just wheat; corn, soy, canola, sunflower and lentils now enter the equation. Clearly, the variety of crops being grown in Bottineau County, North Dakota, is quite different from when I graduated from high school. So what does this example show?

Is the planting of corn a response to an increased number of frost-free days in the Northern latitudes? The answer is yes. At least in this microclimate, it seems something structural is happening. But I would argue that genetics, price and crop insurance are factors at least as contributory as frost-free days. Farmers’ decisions are based on a host of elements.

It is also true that these same factors are driving investments by Cargill and others in Canada’s Prairie Provinces of Manitoba, Saskatchewan and Alberta, and that Canada’s grain mix is being transformed.

While changes in temperature and moisture may play out differently on different continents, I was encouraged to see the preliminary research that MIT presented at the last Forum on the world’s breadbaskets. I recognize that this work is preliminary but I congratulate MIT for tackling such a highly charged issue. In the absence of climate mitigation policies, the ability to adapt is critical. That is why this research is so important.

The importance of working together 

Agricultural production has always been affected by variability in weather...and farmers have adapted strategies appropriate to their local situation.

Who knows when and how much we will tip the scales from weather volatility to climate change? We all need to watch carefully for this future. We are looking to scientists to help us with these questions, to help define the borders and the scenarios so we can help create a more sustainable food system.

I believe that we have the power to adapt, and that the resilience we have shown in the face of change will continue.

So if there is one point to leave you with, it is the importance of working together on this issue of feeding the world. That is why we are so pleased to see MIT involved in this work.

Cargill is optimistic that we can, in fact, feed our world – even in a changing environment.

NOTE: These are the speaker’s “as-prepared” remarks.

E2e
News Release
MIT News Office

June 17, 2013
Vicki Ekstrom
MIT Energy Initiative

Energy efficiency promises to cut emissions, reduce dependence on foreign fuel, and mitigate climate change. As such, governments around the world are spending tens of billions of dollars to support energy-efficiency regulations, technologies and policies.

But are these programs realizing their potential? Researchers from the MIT Energy Initiative (MITEI) and University of California at Berkeley’s Haas School of Business have collaborated to find out.

The researchers’ energy-efficiency research project, dubbed “E2e,” is a new interdisciplinary effort that aims to evaluate and improve energy-efficiency policies and technologies. Its goal is to support and conduct rigorous and objective research, communicate the results and give decision-makers the real-world analysis they need to make smart choices.

The E2e Project is a joint initiative of the Energy Institute at Haas and MIT’s Center for Energy and Environmental Policy Research (CEEPR), an affiliate of MITEI — two recognized leaders in energy research.

The project’s name, E2e, captures its mission, the researchers say: to find the best way to go from using a large amount of energy (“E”) to a small amount of energy (“e”), by bringing together a range of experts — from engineers to economists — from MIT and UC Berkeley. This collaboration, the researchers say, uniquely positions the E2e Project to leverage cutting-edge scientific and economic insights on energy efficiency.

“Cutting energy has lots of potential to help us save money and fight climate change,” says Michael Greenstone, MIT’s 3M Professor of Environmental Economics and a member of MITEI’s Energy Council. “It’s critical to find the local, national and global policies with the biggest bang for the buck to use governments’, industry’s and consumers’ money wisely while slowing climate change.” 

Greenstone is leading the project with Christopher Knittel, co-director of CEEPR, and Catherine Wolfram, associate professor and co-director of the Energy Institute at Haas.

“When deciding on the best energy measures to implement, decision-makers should compare model predictions to actual consumer behaviors. That’s where this project comes in,” Wolfram says. “The E2e Project is focused on singling out the best products and approaches by using real experiments centered on real buying habits. It will provide valuable guidance to government and industry leaders, as well as consumers.”

The group’s motivations for studying energy efficiency derive, in part, from the McKinsey curve — a cost curve suggesting that much emissions abatement actually pays for itself.

“Our goal is to better understand what the costs and benefits of energy-efficient investments are — where the low-hanging fruit is, as well as how high that fruit is up the tree,” says Knittel, MIT's William Barton Rogers Professor of Energy Economics at the MIT Sloan School of Management. “The McKinsey curve would suggest the fruit’s already on the ground. If this is true, we want to figure out why no one is picking it up.”

Former U.S. Secretary of State George P. Shultz, a member of the E2e advisory board, says, “I like the saying ‘A penny saved is a penny earned,’ which rings true from the standpoint of energy. Energy that is used efficiently not only reduces costs, but is also the cleanest energy around. The E2e Project will allow us to better understand which energy-efficiency programs save the most pennies.”

Shultz is a distinguished fellow at Stanford University’s Hoover Institution, where he leads the Energy Policy Task Force. The board also includes MIT Institute Professor John Deutch, former undersecretary of the Department of Energy; Cass Sunstein, a professor at Harvard Law School and President Obama’s former director of regulatory affairs; Susan Tierney, managing principal at Analysis Group and a former Department of Energy official; and Dan Yates, CEO and founder of Opower.                                                           

The E2e Project seeks to answer questions such as: Are consumers and businesses bypassing profitable opportunities to reduce their energy consumption? What are the most effective ways to encourage individuals and businesses to invest in energy efficiency? Are current energy-efficiency programs providing the most savings?

The project’s first experiments are already underway. For example, the team is tracking consumers’ vehicle purchasing decisions to discover if better information about a car’s fuel economy will influence consumers to buy more fuel-efficient vehicles. If so, emphasizing the calculated fuel savings in the vehicle information presented to consumers may be productive. 

Other initial projects include evaluating the Federal Weatherization Assistance Program, and determining why households invest in energy efficiency and the returns to those investments.

More information: e2e.haas.berkeley.edu or e2e.mit.edu

The E2e Project was funded with a grant from the Alfred P. Sloan Foundation.

energy efficiency
In The News
Washington Post: Wonk Blog

It’s something we hear from policymakers again and again: The world squanders too much energy. And wringing out that waste should be one of the easiest ways for the United States and other countries to save money and curb pollution.

But as it turns out, much of what we know about the topic of energy efficiency is still fairly hazy. Sure, it’s technically doable to make cars more fuel-efficient or to insulate homes to prevent heat from leaking out. But which of these efforts are really the most cost-effective? And if it’s such a no-brainer, why aren’t people already taking these steps?

The fact that we still don’t have great answers to those questions is what inspired a group of economists at MIT and the University of California, Berkeley to launch a big new project, called E2e, that will try to apply more scientific rigor to the whole topic of energy efficiency.

“Almost all of the previous work on energy efficiency comes from engineering studies, which look at what’s possible under ideal conditions,” says Michael Greenstone, an economist at MIT and co-director of the E2e project. “We wanted to ask a slightly different question — what are the actual returns you could expect in the real world?”

Here’s what he means. In 2009, McKinsey & Co. released an eye-popping study demonstrating that the United States could hugely improve the efficiency of its homes, offices and factories, through strategies like sealing leaky building ducts and upgrading old appliances. By doing so, McKinsey estimated, the country could save $680 billion dollars over 10 years and do the climate equivalent of taking all the nation’s cars off the road.

Yet as economists scrutinized those numbers, they realized the picture is more complex. ”Those engineering studies can’t account for the behavioral changes you might see in response to efficiency improvements,” says MIT’s Christopher Knittel, who also co-directs the E2e project. “People could, for instance, start adjusting their thermostat if it becomes cheaper to cool the house.” (This is known as the “rebound effect.”)

Ideally, says Knittel, researchers would start conducting rigorous, randomized controlled trials to find out precisely how effective various efficiency policies are. The E2e website lists some of the detailed work that has been done on this front — though there aren’t many such studies.

One recent study of Mexico, for instance, found that a government program to help people to upgrade their refrigerators with energy-saving models really did curtail electricity use. However, a similar program for air conditioners had the opposite effect — when people got sleeker A/C units, they used them more often, and energy use went up.

“The point is that policymakers aren’t going to spend an infinite amount of money trying to save energy or reduce greenhouse gases,” Greenstone says. “So the motivation is to find the places where the return is the greatest. If you could reduce a ton of carbon-dioxide for $100 or two tons for $50, you’d choose the latter.”

The researchers are also asking why, if it’s so compelling, people and businesses don’t already take steps to become more energy efficient. Is it because people aren’t aware that they can? Are there actual market barriers that could be addressed by policy? (For instance, landlords may have little incentive to invest in energy-saving appliances for their tenants.) Or is it just that the purported savings aren’t worth it in the first place?

“It’s easy to come up with conjectures for why people aren’t choosing more efficient options,” says Catherine Wolfram, an economist at the Energy Institute at Haas in Berkeley. “Maybe people don’t have the right information; maybe people are procrastinating. But right now, these are just stories. It’s an area where we need more evidence.”

Some work is being done on this front. Knittel, for instance, is conducting an experiment to see whether people will buy more fuel-efficient cars if they simply receive more detailed information about gasoline costs and mileage. Greenstone and Wolfram are carrying out a randomized controlled trial to scrutinize a U.S. government program to help weather-proof the homes of low-income people.

“Part of the reason we started this project is that efficiency is one of the few areas where there’s broad agreement across the political spectrum that these are policies we should be pursuing,” Greenstone says. “And we want to be able to show what actually works and what doesn’t.”

nuclear power
News Release
MIT News

June 14, 2013
Alli Gold
MIT Joint Program on the Science and Policy of Global Change 

After the 2011 Fukushima nuclear disaster, energy experts and policymakers around the world began to reassess the future of nuclear power. Countries including Japan and Germany have since scaled back their nuclear power or plan to shut it down — sparking a global debate over how nations will replace it.

Taiwan is just one country where this intense debate is unfolding. Yen-Heng Henry Chen, a Taiwan native and research scientist at MIT’s Joint Program on the Science and Policy of Global Change, decided to look at how the nation’s economy and emissions reduction strategies might be affected by future changes to Taiwanese nuclear energy policies.

“There has been little research on the interactions between non-nuclear and low-carbon policies,” Chen says. “Taiwan has a small economy and limited natural resources, making it an interesting case study for other countries looking for ways to cut carbon emissions with or without nuclear power.”

The Taiwanese government aims to cut its CO2 emissions in half (from 2000 levels) by 2050. One way it had planned to do this was through nuclear power. Taiwan currently has three nuclear power plants, with plans to bring a fourth, the Longmen Nuclear Power Station, online in 2015. This densely populated country has more than nine million residents within 50 miles of its three existing nuclear reactors. Because Taiwan is similar to Japan in topography and fault lines, the prospect of the new plant — and perhaps others to come — has raised public concerns about the safety of nuclear power.

“After the Fukushima accident, more than 60 percent of the Taiwanese population was against the construction of a new nuclear power plant according to a recent poll,”  Chen says. “I wanted to know what it would mean for the Taiwanese economy and the government’s emissions reduction targets if they were to eliminate or reduce nuclear power.”

Taiwan currently imports 99 percent of its energy, which includes oil, natural gas, coal and nuclear. Because the opportunities for alternative low-carbon energies such as solar, wind and hydro are limited, Chen conducted an economy-wide analysis that explored other ways to reduce carbon emissions: nuclear power, a carbon tax, and carbon capture and storage (CCS) technology.

When implementing a low-carbon, non-nuclear policy without the availability of CCS (which is not yet cost-effective at large scale), Chen finds that GDP would drop by about 20 percent by 2050. If CCS were to become more cost-effective and could be added to the low-carbon strategy, GDP would drop by less than 10 percent. But the least expensive way to pursue a low-carbon policy, Chen finds, would be to expand nuclear capacity in addition to adopting CCS. If nuclear capacity were tripled compared with current levels and the CCS option were feasible, the GDP loss by 2050 would be reduced to around five percent.

Absent nuclear power and CCS, “Taiwan needs to convert its industrial structure into a much less energy intensive one if the country is serious about achieving a low-carbon environment,” Chen says. Taiwan’s industrial sector accounts for almost half of the country’s energy demands.

Costs could be lowered for industry and consumers if Taiwan were able to join an international emissions trading system — which Chen looks forward to exploring further in future research.

Until such an international trading system exists, “This case study can help policymakers better understand the costs of cutting CO2 emissions without nuclear energy,” Chen says, “as nuclear power becomes a less viable energy solution in Taiwan and around the world.”

ethiopia
News Release
MIT News

June 13, 2013
Alli Gold
MIT Joint Program on the Science and Policy of Global Change

If you know how much something costs, you can budget and plan ahead. With this in mind, a team of researchers from MIT, the World Bank and the International Food Policy Research Institute recently developed a country-level method of estimating the impacts of climate change and the costs of adaptation. This new method models sector-wide and economy-wide estimates to help policymakers prepare and plan for the future.

"Previous country-level research assessing climate change impacts and adaptation either focused on economy-wide estimates or sector-by-sector analysis, without looking at the bigger picture," says Kenneth Strzepek, one of the lead authors of the study and a research scientist at MIT's Joint Program on the Science and Policy of Global Change. "By looking at the interplay between different sectors and within the economy, we are able to evaluate the indirect effects and interactions that can occur that are often not captured."

As a case study, the researchers apply their technique to Ethiopia — the second most populated country in Sub-Saharan Africa. They look at three key sectors: agriculture, road infrastructure and hydropower.

"These sectors were selected because of their strategic role in the country's current economic structure and its future development plans," Strzepek says.

Agriculture accounts for about 46 percent of the GDP in Ethiopia and is almost entirely rain-fed. Variability in temperature and rainfall will have major impacts on this crucial industry. The researchers found that with a temperature increase of two degrees Celsius, more intense drought and floods will cause a drop in crop production — triggering reductions in income, employment and investments.

Frequent and intense flooding will also damage Ethiopia's road infrastructure — the backbone of the country's transportation system and a needed link in the agricultural supply chain. The researchers found that flooding brought on by climate change will increase maintenance costs by as much as $14 million per year for the existing road network, which is expected to grow dramatically in the next 40 years.

The intense variability of precipitation will also greatly impact the country's hydropower and associated reservoir storage, which could provide energy, irrigation and flood mitigation. Because there is currently little installed hydro capacity in Ethiopia, the model showed few climate change impacts. But in the coming years, the government plans to invest heavily in this sector, meaning there could potentially be significant impacts to this sector as well.

Additionally, the researchers found that demand for water would increase across sectors, creating challenges for policymakers in distributing this important resource effectively. For example, Ethiopia plans to expand irrigated agriculture by 30 percent by 2050. The researchers found that some of the irrigation demand will go unmet, placing pressure on other sectors that require water resources.

"This research makes clear the impact droughts, floods, and other effects brought on by climate change can have on major financial sectors and infrastructure," Strzepek says. "For Ethiopia, we find that one of the best defenses against climate change is investment in infrastructure for transportation, energy and agriculture. By building up these sectors, the government will be able to enhance the country's resiliency."

He continued, "In predicting the outcomes of future water, infrastructure and agriculture projects, we were able to test the effectiveness of policies. This gives decision-makers in these countries, as well as international organizations, the information they need to continue to grow, develop and plan for the future with climate change in mind."

Planning for climate change is essential, Raffaello Cervigni, a co-author of the study and lead environmental economist at the World Bank, writes in a recent blog post.

"Addressing climate change is first and foremost a development priority for Africa … If no action is taken to adapt to climate change, it threatens to dissipate the gains made by many African countries in terms of economic growth and poverty reduction over the past ten years," he writes.

But, he continues, "a harsher climate need not be an impediment for Africa's development," if we can come together to address these challenges.

The integrated approach used by the authors is now being applied to studies on the costs of adapting to climate change in Ghana and Mozambique, as well as Vietnam. Others have replicated the approach to help other countries calculate the costs of adaptation.

Reprint 2013-7

A Slice of MIT

By Joe McGonegal

June 12, 2013 

On the slopes of Mt. Karisimbi, a 4,500-meter volcano in northwestern Rwanda, a lone MIT researcher is working this year to add new data to climate change research.

She is Katherine Potter PhD ’11, the principal investigator for the new Rwanda Climate Observatory. Working in the same area where iconic zoologist Dian Fossey studied mountain gorillas a half-century ago, Potter works just as passionately toward her goal: to empower Rwandans to take part in climate change research and to get Africa on the climate-change grid.

If Potter is successful, the observatory atop Mt. Karisimbi will join the Advanced Global Atmospheric Gases Experiment (AGAGE), a worldwide program funded in part by NASA and NOAA that captures climate data.

AGAGE began in 1978 and now includes eight observatories around the world that record air pollution and greenhouse gas emissions. It is a leading source of data for measuring progress against the benchmarks of the 1987 Montreal Protocol on ozone-depleting substances and the 1997 Kyoto Protocol on greenhouse gas emissions.

Until now, Africa has not had an observatory feeding into AGAGE’s experiment. Since the continent covers a fifth of the world’s land mass, that is no small gap in the data.

Potter hopes to fix that. Working for MIT’s Center for Global Change Science, Potter is training future Rwandan scientists, technicians, and academics to collaborate in the world’s efforts to monitor climate change.

Mt. Karisimbi is a perfect place for the observatory, says Potter, who blogs about her progress. “At 4,500 meters, the air reaching the station will come from a large area, getting info from much of Africa and the surrounding oceans,” she says. “Also, it shares a border with Congo and is in the same protected area that continues into Uganda. So this is unifying the East African community in doing climate research.”

Potter’s work grew out of a conversation Rwandan president Paul Kagame began with then-MIT president Susan Hockfield in 2008, when Kagame was on campus for the Compton Lecture. CGCS director Ronald Prinn ScD ’71 and geophysics professor Maria Zuber have led MIT’s efforts to develop the project since.

The project has inspired other alums, like Jonathan Goldstein ’83. “My wife Kaia Miller Goldstein and I have worked with both the Rwandan and MIT leadership,” Goldstein says. “It has been exciting to see them collaborate on this worthy project.  We were thrilled to meet [Potter] while in Rwanda recently. She is a real star.” 

“I think the real joy for those involved comes from the cultural collaboration, where MIT scientists can really make a difference in the world and the Rwandan people can show the world that they are rapidly advancing as a society,” says Goldstein.

MIT is one of ten universities that participate in AGAGE, a venture jointly funded by British, American, and Australian government agencies. AGAGE instruments around the world measure and report on the atmospheric levels of 33 compounds.

Potter is collaborating with the Ministry of Education in Rwanda, which is recruiting top academics and analysts from within its borders to participate. The Rwandan government is also planning to construct an €18-million cable car up Karisimbi, in the hopes that the station becomes a tourist destination, too. Potter estimates that the observatory will be complete and staffed by Rwandan scientists in the next three or four years.

MIT News

“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.”

A version of this quote, originally penned by Sir Arthur Conan Doyle in “The Case-Book of Sherlock Holmes,” appears in a dog-eared copy of “Advanced Mathematical Methods for Scientists and Engineers” on a shelf in Pierre Lermusiaux’s office. The textbook, which he has kept since he was an engineering undergraduate in Belgium, introduces each chapter with a quote from the fictional sleuth — a literary prompt that pushed Lermusiaux, as a young student, to keep reading.

The quote above is particularly apt for Lermusiaux, who has devoted his research, in part, to eliminating unlikely scenarios in ocean dynamics.

Lermusiaux leads MIT’s Multidisciplinary Simulation, Estimation, and Assimilation Systems (MSEAS) group, which develops models and assimilation schemes to better predict ocean behavior for a wide range of applications — from planning the most efficient paths for underwater robots to anticipating how bioluminescent organisms will affect sonar propagation.

The group focuses, in part, on modeling coastal areas, which Lermusiaux describes as a veritable sea of complexity.

“In coastal areas, things can get more mixed up than in the open ocean,” says Lermusiaux, an associate professor in the Department of Mechanical Engineering. “You have fronts and eddies, currents and jets, and the effects of winds, the seabed and the Earth’s rotation. There is a lot of coastal ocean in the world, and it’s very dynamic."

Working hard for fun

Fluid dynamics interested Lermusiaux early on; he remembers learning of the Coriolis effect — the inertial force created by the Earth’s rotation — in a high school geography class.

“The teacher started explaining with an apple, and I still vividly remember that part, and thought it was fascinating how these forces would appear,” he recalls.

Lermusiaux grew up in Liège, Belgium, in a family of scientists. His father is a nuclear engineer, his mother a geography professor, and his sister an architect. The family often went along on his mother’s field trips, and took countless vacation detours to visit natural sites and manmade systems, including old ruins and architectural relics, following the family mantra: “It needs to be seen."

His father comes from a long line of farmers, dating back five generations — a lineage that may have rubbed off on Lermusiaux, who spent many of his weekends and holidays working at a local farm with a friend.

“We’d get up very early in the morning, and they’d do a very good breakfast of eggs and bacon, and you were almost like a son of the family,” Lermusiaux says. “We’d show up, work very hard, and we’d stink by the end of the day. But it didn’t seem like work to us — it was fun.”

When it came time to decide on a path after graduating with an undergraduate degree in mechanical engineering from the University of Liège, Lermusiaux recalls broaching the subject of graduate studies abroad over the dinner table. Not long after, he headed across the Atlantic to Harvard University to pursue a PhD in engineering science.

Going coastal

For his thesis, Lermusiaux worked to whittle down the uncertainty in ocean modeling. At the time, ocean data were relatively limited, and samples came with some uncertainty. As a result, approximate models initialized using that fuzzy data could lead to widely varying predictions. Lermusiaux looked for ways to characterize and predict uncertainty, and for ways to combine models with multiple data sets to reduce this uncertainty. He developed a data-assimilation method and computational schemes that produced better estimates of, and furthered understanding of, ocean dynamics. His work came at a pivotal time in ocean engineering. 

“It was the end of the Cold War, and people were looking less at the deep ocean, and moving toward the coast,” Lermusiaux says. “It was the beginning of trying to resolve the multiple scales and the motions in the ocean that matter, as realistically as possible.”

During his time at Harvard, Lermusiaux’s work occasionally took him out to sea. On one sampling expedition, he spent three weeks aboard a NATO ship near the Faroe Islands, halfway between Norway and Iceland. The region sits along the Iceland-Faroe Ridge, where warm currents from the Atlantic meet frigid waters from the Nordic seas. The interplay between the two water masses creates extremely powerful fronts that can deflect sonar signals. (The region, in fact, is a setting for the novel “The Hunt for Red October,” in which a Russian submarine evades detection by hiding in the turbulent waters.) Onboard the ship, Lermusiaux analyzed data collected during the cruise and found large-scale wave modes.

Today, he says, much of this computational engineering work can be done remotely, thanks to the Internet. Researchers can download data directly from cruise servers, and perform analyses on more powerful computers in the lab. 

Eliminating the impossible

Lermusiaux set up his own lab at the end of 2006 when, after receiving his PhD from Harvard, he accepted a faculty position at MIT. Based in the ocean science and engineering section of MIT’s Department of Mechanical Engineering, his group carries out research in mechanics, computations and control. Specifically, his group has developed and applied new methods for multiscale modeling, uncertainty quantification, Bayesian data assimilation and the guidance of autonomous vehicles. 

A specific focus has been to answer questions involving nonlinearities and multiple scales. For example, the team is modeling the dynamic marine environment in Stellwagen Bank, at the mouth of Massachusetts Bay — a rich ecological web of life forms from plankton to whales. Lermusiaux’s group uses mathematical computations to model the relationship between physical and biological processes, aiming to understand how eddies, waves and currents enhance the region’s nutrient delivery and retention. 

The group has also been looking further out to sea to study multiscale dynamics at continental shelf breaks — boundaries at which the shallow ocean floor suddenly drops off, plunging thousands of feet and giving way to much deeper waters. 

“You have fronts between the shelf water and deeper water, and that’s an important region for exchanges,” Lermusiaux explains. “However, the multiscale interactions at shelf breaks are not well understood.” 

Recently, his group has characterized the multiscale variability of internal tides at the Middle Atlantic Bight shelf break, showing how this variability can be driven by strong winds and by direct Gulf Stream interactions.

To allow such multiscale studies, Lermusiaux’s team has adapted new ideas in computational fluid dynamics. They are developing numerical models with variable resolutions in time and space, and have also created equations that predict uncertainty in large-scale ocean systems. They then developed nonlinear Bayesian data-assimilation methods that employ these uncertainty predictions. These methods can predict the likelihood of different scenarios and combine these scenarios with actual field observations in a rigorous Bayesian fashion.  
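The scenario-weighting step described above can be illustrated with a minimal sketch. This is not the MSEAS implementation — their schemes are nonlinear and far more sophisticated — but it shows the core Bayesian idea: each candidate ocean state (scenario) is re-weighted by how well it predicts an actual field observation.

```python
# Minimal sketch of Bayesian scenario weighting (illustrative only;
# the MSEAS data-assimilation schemes are far more sophisticated).
import math

def bayes_update(priors, predicted, observed, noise_std):
    """Re-weight scenario probabilities given one noisy observation."""
    # Gaussian likelihood of the observation under each scenario.
    likes = [math.exp(-0.5 * ((observed - p) / noise_std) ** 2)
             for p in predicted]
    # Posterior is proportional to prior times likelihood, then normalized.
    weighted = [pr * li for pr, li in zip(priors, likes)]
    total = sum(weighted)
    return [w / total for w in weighted]

# Three equally likely current-speed scenarios (m/s); a sensor reads 0.9.
posterior = bayes_update([1/3, 1/3, 1/3], [0.2, 0.8, 1.5], 0.9, 0.3)
# The middle scenario, closest to the observation, now dominates.
```

Repeating this update as observations arrive concentrates probability on the scenarios consistent with the data — the numerical analogue of eliminating the impossible.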

The researchers are also applying their models to the dynamic control and planning of swarms of autonomous underwater vehicles, or AUVs. Increasingly, these robots are used to sample and monitor the ocean for pollution, marine populations, energy applications, and security and naval operations. With his students, Lermusiaux is developing mathematical models to determine the most efficient paths for robots to take, maintaining coordination among robots along the way. For instance, if a current is likely to flow in a certain direction, a robot may want to simply ride the wave toward its destination.  

Lermusiaux’s group is also working on schemes that guide such sensing robots toward locations that provide the most useful undersea data. Similarly, the researchers have recently integrated their work into powerful new systems that can objectively rank competing ocean models, accounting for all uncertainties. 

The key to this kind of modeling, as with much of Lermusiaux’s work, is eliminating unlikely, or impossible, scenarios. Determining whether a vehicle should go left or right, for example, becomes a numerical process of elimination based on parameters such as current speed and direction — though that is an oversimplification of the incredibly complex environment he models.

"We have made advances in numerical schemes, uncertainty prediction, data assimilation and inference, which all have applications in many engineering and scientific fields,” Lermusiaux says. “The smarter you are in combining information with model simulations, the better you can be.”

Oceans at MIT

The Keeling Curve record from the NOAA-operated Mauna Loa Observatory shows that the atmospheric carbon dioxide concentration hovers around 400 ppm, a level not seen in more than 3 million years when sea levels were as much as 80 feet higher than today. Virtually every media outlet reported the passage of this climate milestone, but we suspect there’s more to the story. Oceans at MIT’s Genevieve Wanucha interviewed Ron Prinn, Professor of Atmospheric Science in MIT’s Department of Earth, Atmospheric and Planetary Sciences. Prinn is the Director of MIT’s Center for Global Change Science (CGCS) and Co-Director of MIT’s Joint Program on the Science and Policy of Global Change (JPSPGC).

Prinn leads the Advanced Global Atmospheric Gases Experiment (AGAGE), an international project that continually measures the rates of change of the air concentrations of 50 trace gases involved in the greenhouse effect. He also works with the Integrated Global System Model, which couples economics, climate physics and chemistry, and land and ocean ecosystems, to estimate uncertainty in climate predictions and analyze proposed climate policies.

What is so significant about this 400-ppm reading?
This isn’t the first time a reading of 400 parts per million (ppm) of atmospheric CO2 has been obtained. It was recorded at a NOAA observatory station in Barrow, Alaska, in May 2012. But the recent 400-ppm reading at Mauna Loa, Hawaii made the news because that station produced the famous “Keeling Curve,” the longest continuous record of CO2 in the world, going back to 1958.

‘400’ is just a round number. It’s more of a symbol than a true threshold of climate doom. The real issue is that CO2 keeps going up and up at about 2.1 ppm a year. Even though emissions fell in most fully developed countries during the global recession, China, and to a lesser extent India and Indonesia, blew right through it and continued to increase their emissions.

Has anything gone unappreciated in the news coverage of this event?

Yes. What’s not appreciated is that there are a whole lot of other greenhouse gases (GHGs) that have fundamentally changed the composition of our atmosphere since pre-industrial times: methane, nitrous oxide, chlorofluorocarbons (CFCs), and hydrofluorocarbons. The screen of your laptop was probably manufactured in Taiwan, Japan, or eastern China by a process that releases nitrogen trifluoride — release of 1 ton of nitrogen trifluoride is equivalent to 16,800 tons of CO2. But there is a fix to that: the contaminated air in the factory could be incinerated to destroy the nitrogen trifluoride before it’s released into the environment.

Many of these other gases are increasing, percentage-wise, faster than CO2. In the Advanced Global Atmospheric Gases Experiment (AGAGE), we continuously measure over 40 of these other GHGs in real time over the globe. If you convert these other GHGs into the equivalent amounts of CO2 that would have the same effect on climate, and add them to the NOAA measurements of CO2, you find that we are actually at 478 ppm of CO2 equivalents right now. In fact, we passed 400 ppm of CO2 equivalents back in about 1985. So 478, not 400, is the real number to watch. That’s the number people should be talking about when it comes to climate change.

What has Advanced Global Atmospheric Gases Experiment (AGAGE) revealed about this greenhouse gas problem?

The non-CO2 GHGs are very powerful. One example is sulfur hexafluoride (SF6), which used to be in Nike shoes, and is now most widely used in the step-down transformers in long-distance electrical power grids. But SF6 leaks a lot, with 1 ton equivalent to 22,800 tons of CO2, and it’s increasing in our measurements. Another example is methane. We have been measuring methane for almost 30 years now, and it actually didn’t increase for almost 8 years from 1998 onwards, but we discovered in our network that it began to increase again in 2006. We published this finding in 2008, and ever since, methane has been rising at a rapid rate. Nitrous oxide, the third most important GHG, has been going up almost linearly since we started measuring it in 1978.
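The per-ton equivalences Prinn quotes are global warming potentials (GWPs): emissions of each gas are weighted by its warming effect relative to CO2. A minimal sketch of that bookkeeping, using the two figures quoted in this interview (note that converting atmospheric *concentrations* to the 478-ppm CO2-equivalent figure is a separate, radiative-forcing calculation, not shown here):

```python
# Illustrative sketch of CO2-equivalent accounting for emissions.
# GWP values below are the per-ton equivalences quoted in this article.
GWP = {
    "CO2": 1,
    "NF3": 16_800,   # nitrogen trifluoride
    "SF6": 22_800,   # sulfur hexafluoride
}

def co2_equivalent_tons(emissions_tons):
    """Convert {gas: tons emitted} into total tons of CO2 equivalent."""
    return sum(tons * GWP[gas] for gas, tons in emissions_tons.items())

# Example: a single ton of SF6 outweighs 20,000 tons of CO2.
print(co2_equivalent_tons({"CO2": 20_000, "SF6": 1}))  # 42800
```

This weighting is why trace gases measured in parts per trillion can still matter for the climate bottom line.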

The worrisome thing is that almost all of these gases keep rising and, per ton, they are very powerful drivers of warming. Many of these GHGs have lifetimes of hundreds to thousands to tens of thousands of years, so they are essentially in our atmosphere forever. There is almost nothing practical we can do to vacuum these gases out again. 

Is it possible to decrease atmospheric CO2?

One well-understood method of removing CO2 from the atmosphere is carbon sequestration, in which you remove the CO2 from the biomass burnt in an electrical utility and then bury it in subsurface saline aquifers or in the deep ocean. There are people here at MIT, including Rob van der Hilst and Brad Hager, who study the question of just how permanent this deep burial on land is.

Carbon sequestration can also lower CO2 emissions from coal-fired power plants. It looks like the Department of Energy will reactivate a couple of these projects in Wisconsin and Texas to better understand this technology, with the goal of lowering the emissions from power plants to, say, 10% or less of what they were.

At the end of the day, the smart thing would be not to resort to vacuuming CO2 out of the atmosphere and putting it down deep underground. It would be better to develop new and affordable zero- or very low-emission energy technologies such as biofuels, nuclear, solar and wind.

Will switching to ‘fracked’ natural gas reduce warming? 

Hydraulic fracturing in a vertical well. Credit: EPA

We have run our Integrated Global System Model presuming that hydraulically fractured gas from shale deposits in the US and elsewhere around the world could begin to be used at large scale. We’ve looked at the question of whether converting all oil usage to fracked gas usage over the next 20-30 years would lower the rate of warming. The answer is yes, because you get about twice as much energy per ton of CO2 emitted from burning methane as you get from coal.
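That factor of roughly two can be checked with back-of-the-envelope combustion stoichiometry. The heating values below are approximate textbook figures, not numbers from the interview, so treat this as a rough sketch:

```python
# Rough check of the "twice the energy per ton of CO2" claim for
# methane vs. coal. Heating values are approximate (assumptions);
# CO2 yields follow directly from combustion stoichiometry.
M_C, M_CH4, M_CO2 = 12.0, 16.0, 44.0  # molar masses, g/mol

# Methane: CH4 + 2 O2 -> CO2 + 2 H2O, roughly 55 MJ per kg of CH4.
ch4_energy_per_kg = 55.0               # MJ/kg (approximate)
ch4_co2_per_kg = M_CO2 / M_CH4         # 2.75 kg CO2 per kg CH4
ch4_mj_per_kg_co2 = ch4_energy_per_kg / ch4_co2_per_kg

# Coal, idealized as pure carbon: C + O2 -> CO2, roughly 33 MJ per kg.
coal_energy_per_kg = 33.0              # MJ/kg (approximate)
coal_co2_per_kg = M_CO2 / M_C          # ~3.67 kg CO2 per kg C
coal_mj_per_kg_co2 = coal_energy_per_kg / coal_co2_per_kg

print(round(ch4_mj_per_kg_co2 / coal_mj_per_kg_co2, 2))  # 2.22
```

Methane's extra hydrogen means much of its heat comes from making water rather than CO2, which is the physical reason behind the factor of two.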

There are some serious issues about the water used to pump down and split the shale. In the fracking process, trace chemicals are added into the water to make it slippery so the water can force itself in between the layers of shale. The problem is, shale is filled with mucky stuff such as salts and heavy organics, which all ends up in the frack water and comes back up to the surface. So what do you do with that very polluted water? Then there is the concern that the water could travel horizontally and vertically through the shale layers and end up in ground water. And that’s an environmental issue that has to be addressed.

However, chemical companies are already investing in technologies that can take the frack water that’s pumped back out and literally clean out the hydrocarbons and re-use it again for fracking. So, there is an answer to the frack water problem, but there must be a strong push to make sure fracking is environmentally sound.

We did find that if you increased the use of fracked gas and didn’t repair the existing natural gas pipelines, they could leak several percent of the transferred volume, because much of it is old city and intercity infrastructure. Such leaking occurs now in all major pipeline systems in the US and Europe, which is a problem because leaked methane is a much more powerful GHG per ton than CO2. So repairing or replacing old gas pipelines will be a big requirement.

Addressing all these environmental concerns will add somewhat to the cost of energy. But most who study the climate issue in detail and in depth understand that the damages that are going to result from continued warming will far exceed the cost of any policy that we put together to lower GHG emissions. Yet, as you know, the politics of climate in Washington is impossible right now because a minority of senators can block any legislation. It doesn’t look like anything will happen soon on a national emissions reduction policy. Politics trumps science on these issues. But the EPA has the power to treat CO2 as an air pollutant so maybe that’s what will happen near term.

The bottom line is if we switched from using oil and coal globally to running everything on shale gas, there probably is enough gas there. But with this alone, you would still get about a 3.5 C warming by 2100. With no policy at all, our model estimates a 5-degree or higher warming. So replacing coal and oil with fracked gas is a sensible pathway for the US to go over the next few decades, with the additional advantage of gaining more energy independence. But it won't remove the global warming threat beyond that.

What are the implications of the 478-ppm measurement to human life?

According to the paleoclimatological ice core record, if our planet warms more than 2 C globally (4 C at the poles), we are in trouble. That’s about 6 meters or 20 feet of sea level rise. Most of the world’s valuable infrastructure and high populations are along the coasts. So, the damage and cost of sea level rise alone is potentially very high. Other risky phenomena we face are shifting rainfall patterns that may move the locations of arable farmland out of the US and into Canada. Mexico could grow drier and drier, and there’s concern in the Department of Defense about potential challenges to the security at the southern US border. Other similarly vulnerable areas around the world could face desperate large-scale migrations of people seeking to find places to grow food.

These damages are likely to significantly exceed the costs associated with an efficient and fair GHG policy, such as an emissions tax whose revenues are used to offset income taxes.