News + Media
Competing demands for food, fuels, and forests
How do you value an ecosystem? Putting a dollar value on natural systems such as forests has long challenged economists.
Forests provide “non-use values,” such as the pleasure of knowing that a natural system exists, and recreational values, such as hunting, fishing and wildlife viewing. But recently, ecologists have also sought to value a broader set of “ecosystem services,” or the goods and services that ecosystems provide to a market economy.
Ecosystem services related to land include conventional food and forest products, as well as the potential to produce biofuels. But ecosystems also store carbon, and if a price on carbon were established, it would create an incentive to enhance that storage. This new ecosystem service would need to be balanced against conventional food, forestry and biofuels production. As the number of ecosystem services that are fully priced in a market expands, so does the demand for land.
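As a rough illustration of the balancing act a carbon price creates (the per-hectare figures below are hypothetical and are not taken from the study), the sketch compares the annual return from cropping a hectare with the return from leaving it forested and selling credits for the carbon it stores:

```python
# Illustrative only: hypothetical per-hectare numbers, not values from the MIT study.

def annual_land_value(crop_rent, carbon_price, sequestration_rate):
    """Compare the annual return from cropping a hectare with the return
    from leaving it forested and selling credits for the carbon it stores.

    crop_rent          -- net annual revenue from crops (USD/ha/yr)
    carbon_price       -- price paid per tonne of CO2 stored (USD/tCO2)
    sequestration_rate -- CO2 taken up by the standing forest (tCO2/ha/yr)
    """
    forest_value = carbon_price * sequestration_rate
    return {"crop": crop_rent, "forest": forest_value}

# Hypothetical example: cropland earns $300/ha/yr; the forest stores 7 tCO2/ha/yr.
for price in (0, 20, 50, 100):  # USD per tonne of CO2
    values = annual_land_value(crop_rent=300, carbon_price=price, sequestration_rate=7)
    better = "forest" if values["forest"] > values["crop"] else "crop"
    print(f"carbon price ${price:>3}/tCO2 -> crop ${values['crop']}/ha, "
          f"forest ${values['forest']:.0f}/ha, favored use: {better}")
```

Once the carbon price is high enough, the credit revenue overtakes the crop rent and keeping the land forested becomes the more valuable use; the study derives its actual prices and land rents from a full economy-wide model rather than fixed numbers like these.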
Researchers from the MIT Joint Program on the Science and Policy of Global Change have used an economic model that explicitly represents the recreational and carbon-storage values of ecosystems. The study examines how demand for these ecosystem services will affect land use, food prices and the prospects for biofuels production.
The study found that demand for biofuels grows when a carbon tax is implemented, increasing CO2 emissions as forests are converted to cropland. However, if the carbon tax also covers emissions from land use change, the resulting economic incentive is enough to avoid deforestation. And if a tradeable credit program rewarding CO2 sequestration on land is implemented, significant reforestation occurs, so that land use becomes a large net sink for CO2.
This is a surprising result, as land use emissions currently make up about 20 percent of total emissions. But with carbon taxes and a tradeable credit program, land use would mitigate emissions by storing carbon in forests and replacing fossil fuels with biofuels. In fact, the analysis shows that if carbon storage were credited, land conversion would eventually store as much as one-third of total global energy emissions over the coming century.
Unfortunately, it’s not that simple: such policies would imply some difficult tradeoffs. In the scenario with full carbon pricing, substantial reforestation and biofuels production occur, but at the expense of conventional agricultural products. The two new non-food demands for land push commodity prices up, with livestock prices hit especially hard because both the rental price of grazing land and the price of grains used as feed rise. As food prices rise, the burden falls hardest on poor consumers.
“Since conventional agricultural goods are priced in markets, the higher [food] prices projected are efficient in the sense that they reflect the marginal value of storing carbon that would be lost if more land were devoted to food production,” explains John Reilly, co-director of the MIT Joint Program and co-author of the study. He adds, “However, the market values do not take into account equity considerations, and so in the absence of food programs worldwide such higher prices would place a disproportionate burden on lower income people.”
Some of the resulting increase in food prices may be offset by future agricultural technology. But even with such technologies, increasing food prices would still be a substantial departure from the historical trend of falling food prices. As new demands for land stem from an expanded view of ecosystem services, special attention will be needed to counteract the impacts on development and food security.
“It is a dilemma where climate change itself may have negative consequences for food production, but extensive reforestation to limit climate change may also squeeze food production by limiting the land available for conventional agriculture. Thrown on top is a demand for land for biofuels production that could put further pressure on food prices,” Reilly says. “The results are a cautionary tale in embracing efficient market solutions in a world where there are not ready mechanisms to deal with inequitable outcomes.”
MIT researchers improve upon methods to model atmospheric aerosols.
Allison Crimmins
Urban regions account for an ever-increasing fraction of Earth’s population, and are consequently an ever-growing source of air pollutants. These pollutants include anthropogenic aerosols, which have important climate and health implications. But modeling aerosol emissions from urban areas is difficult because of the fine temporal and spatial scales required, so urban areas contribute significantly to the overall uncertainty and variability in global atmospheric model predictions of aerosol and pollutant distributions.
To address these uncertainties, researchers from the MIT Joint Program on the Science and Policy of Global Change set out to see if they could better model aerosol emissions and distribution from urban regions. Accurately modeling urban areas requires accounting for the amount and distribution of emissions, the meteorological and geographical properties of the region, and the chemical and physical processing of emissions over time, all on spatial and temporal scales much finer than those resolved by global models. Previously, modelers attempted to account for urban aerosol emissions with a correction factor that diluted total aerosol emissions across global model grid cells. This dilution method, however, does not capture the heterogeneity of urban and non-urban areas within each grid cell.
Instead, the MIT researchers developed a new detailed air quality model, using meteorological and emissions data from 16 representative urban areas. This urban processing model examined seven types of aerosols of different sizes and compositions, and covered a total of 251 urban areas: 91 in China, 36 in India, 50 in developed nations (Australia, Canada, EU, Japan, Singapore, South Korea, US) and 74 in other developing nations. The urban processing model was then incorporated into a larger global model that simulates atmospheric chemistry and transport at regional to global scales. The researchers compared the atmospheric aerosol concentrations predicted by this new method with results from the dilution method.
“Not only are we the first group to successfully incorporate an urban-scale chemical processing model into a 3-dimensional global model,” explains Dr. Jason Cohen, the lead author on the report, “but our results resolve important processes which the rest of the modeling community still neglects to include.”
The study found that the urban processing model predicted a lower concentration of atmospheric aerosols than the dilution method, particularly in the Northern Hemisphere and in the summer season. In addition, the urban processing model showed increased concentrations of primary aerosols, like black carbon and organic carbon, and decreased concentrations of secondary aerosols, like sulfates. Thus excluding the urban processing model could lead to an overestimation of some aerosols and an underestimation of others.
These biases exist in the dilution method because urban areas tend to be more efficient at oxidizing and removing substances like black carbon and organic carbon from the atmosphere; not taking this into account leads to an overestimation of the concentrations of those species. And because those species are oxidized, generation of secondary aerosol species actually increases in urban areas; not taking this into account leads to an underestimation of the concentrations of the secondary species.
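The direction of these biases can be illustrated with a deliberately simple split of emissions into what is exported as primary aerosol and what is oxidized to secondary aerosol before leaving the source region. The oxidized fractions below are made-up placeholders, not values from the MIT metamodel; the point is only the sign of the error when grid-average chemistry is applied to urban emissions:

```python
# Illustrative split of urban emissions; parameters are placeholders, not the
# Cohen/Prinn metamodel.

def exported_aerosol(primary_emissions, oxidized_fraction):
    """Split primary carbonaceous emissions into what survives as primary
    aerosol and what is converted to secondary aerosol before export."""
    secondary = primary_emissions * oxidized_fraction
    primary = primary_emissions - secondary
    return primary, secondary

EMISSIONS = 100.0            # arbitrary units of primary carbon emitted in the city
F_OX_GRID_AVERAGE = 0.2      # oxidized fraction assumed when emissions are diluted
F_OX_URBAN = 0.5             # larger oxidized fraction in chemically active urban air

diluted = exported_aerosol(EMISSIONS, F_OX_GRID_AVERAGE)
urban = exported_aerosol(EMISSIONS, F_OX_URBAN)

print(f"dilution method : primary {diluted[0]:.0f}, secondary {diluted[1]:.0f}")
print(f"urban processing: primary {urban[0]:.0f}, secondary {urban[1]:.0f}")
# With these made-up fractions, the dilution method exports too much primary
# aerosol (80 vs 50) and too little secondary aerosol (20 vs 50) -- the sign
# of the biases described above.
```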
Aerosols tend to cause negative radiative forcing. In other words, they have an overall “cooling effect” on the global climate. But using the urban processing method instead of the dilution method demonstrated an overall smaller concentration of aerosols in the atmosphere. Thus the detailed urban processing model predicts significantly less negative aerosol radiative forcing (less cooling) than the dilution method.
“We are continuing this effort, looking at the long-term climate effects of using detailed urban processing, such as how average surface temperature, precipitation, and cloud cover will be impacted,” says Cohen. “We hope that as we continue to look into the impacts of this new methodology and learn more about the mistakes that the dilution simplification has led to, others in the climate modeling community will adopt and use our new standard.”
See also "Development of a fast, urban chemistry metamodel for inclusion in global models" (PDF)
By JUSTIN GILLIS
The scale of Hurricane Irene, which could cause more extensive damage along the Eastern Seaboard than any storm in decades, is reviving an old question: are hurricanes getting worse because of human-induced climate change?
The short answer from scientists is that they are still trying to figure it out. But many of them do believe that hurricanes will get more intense as the planet warms, and they see large hurricanes like Irene as a harbinger.
While the number of the most intense storms has clearly been rising since the 1970s, researchers have come to differing conclusions about whether that increase can be attributed to human activities.
“On a longer time scale, I think — but not all of my colleagues agree — that the evidence for a connection between Atlantic hurricanes and global climate change is fairly compelling,” said Kerry Emanuel, an expert on the issue at the Massachusetts Institute of Technology.
Among those who disagree is Thomas R. Knutson, a federal researcher at the government’s Geophysical Fluid Dynamics Laboratory in Princeton, N.J. The rising trend of recent decades occurred over too short a period to be sure it was not a consequence of natural variability, he said, and statistics from earlier years are not reliable enough to draw firm conclusions about any long-term trend in hurricane intensities.
“Everyone sort of agrees on this short-term trend, but then the agreement starts to break down when you go back longer-term,” Mr. Knutson said. He argues, essentially, that Dr. Emanuel’s conclusion is premature, though he adds that evidence for a human impact on hurricanes could eventually be established.
While scientists from both camps tend to think hurricanes are likely to intensify, they do not have great confidence in their ability to project the magnitude of that increase.
One climate-change projection, prepared by Mr. Knutson’s group, is that the annual number of the most intense storms will double over the course of the 21st century. But what proportion of those would actually hit land is another murky issue. Scientists say climate change could alter steering currents or other traits of the atmosphere that influence hurricane behavior.
Storms are one of nature’s ways of moving heat around, and high temperatures at the ocean surface tend to feed hurricanes and make them stronger. That appears to be a prime factor in explaining the power of Hurricane Irene, since temperatures in the Atlantic are well above their long-term average for this time of year.
The ocean has been getting warmer for decades, and most climate scientists say it is because greenhouse gases are trapping extra heat. Rising sea-surface temperatures are factored into both Mr. Knutson’s and Dr. Emanuel’s analyses, but they disagree on the effect that warming in remote areas of the tropics will have on Atlantic hurricanes.
Air temperatures are also rising because of greenhouse gases, scientists say. That causes land ice to melt, one of several factors leading to a rise in sea level. That increase, in turn, is making coastlines more vulnerable to damage from the storm surges that can accompany powerful hurricanes.
Overall damage from hurricanes has skyrocketed in recent decades, but most experts agree that is mainly due to excessive development along vulnerable coastlines.
In a statement five years ago, Dr. Emanuel, Mr. Knutson and eight colleagues called this “the main hurricane problem facing the United States,” and they pleaded for a reassessment of policies that subsidize coastal development — a reassessment that has not happened.
“We are optimistic that continued research will eventually resolve much of the current controversy over the effect of climate change on hurricanes,” they wrote at the time. “But the more urgent problem of our lemming-like march to the sea requires immediate and sustained attention.”
By Kerry Emanuel, Special to CNN
August 25, 2011 11:45 p.m. EDT
Editor's note: Kerry Emanuel is a professor of meteorology at the Massachusetts Institute of Technology.

(CNN) -- At this moment, Hurricane Irene poses a risk to almost everyone living along the Eastern Seaboard, from Florida to the Canadian Maritimes. Where will Irene track? Which communities will be affected and how badly? Millions of lives and billions of dollars are at stake in decisions made by forecasters, emergency managers and all of us who live in or own property in harm's way.
It is natural to wonder how good the forecasts are likely to be. To what extent can we trust the National Hurricane Center, local professional forecasters and emergency managers to tell us what will happen and what to do? Undeniably, enormous progress has been made in the skill with which hurricanes and other weather phenomena are predicted. Satellites and reconnaissance aircraft monitor every hurricane that threatens the U.S., collecting invaluable data that are fed into computer models whose capacity to simulate weather is one of the great wonders of modern science and technology.
And the human effort and taxpayer funds that have been invested in this endeavor have paid off handsomely: A three-day hurricane track forecast today is as skillful as a one-day forecast was just 30 years ago. This gives everyone more time to respond to the multiple threats that hurricanes pose.
And yet there are still things we don't know.
For example, we do not know for sure whether Irene will make landfall in the Carolinas, on Long Island, or in New England, or stay far enough offshore to deliver little more than a windy, rainy day to East Coast residents. Nor do we have better than a passing ability to forecast how strong Irene will get. In spite of decades of research and greatly improved observations and computer models, our skill in forecasting hurricane strength is little better than it was decades ago. Why is this so, and how should we go about making decisions in the context of uncertain forecasts?
Since the pioneering work of Edward N. Lorenz in the early 1960s, we have known that weather, including hurricanes, is an example of a chaotic process. Small fluctuations (Lorenz's "butterfly effect") that cannot be detected can quickly amplify and completely change the outcome in just a few days. Lorenz's key insight was that even in principle, one cannot forecast the evolution of some kinds of chaotic systems beyond some time horizon.
In the case of weather, meteorologists think that time horizon is around two weeks or so. Add to this fundamental limitation that we measure the atmosphere imperfectly, sparsely and not often enough, and that our computer models are imperfect, and you arrive at the circumstance that everyone knows from experience: weather forecasts are not completely reliable, and their reliability deteriorates rapidly the further out in time the forecast is made. A forecast for a week from today is dicey at best, and no one even tries to forecast two weeks out. But in the past decade or two, meteorologists have made another important advance of which few outside our profession are aware: We have learned to quantify just how uncertain any given forecast is.
This is significant, because the degree of uncertainty itself varies greatly from one day to the next. On one occasion, we might be able to forecast a blizzard five days out with great confidence; on another, we might have very little faith in tomorrow's forecast.
We estimate the level of confidence in a particular forecast by running many different computer models many times, not just once. Each time we run a model, we feed it a slightly different but equally plausible estimate of the current state of the atmosphere, given that our observations are few, far between and imperfect. In each case, we get a different answer; the differences are typically small to begin with but can grow rapidly, so that by a week or so the difference between any two forecasts is as great as the difference between any two arbitrary states of the weather at that time of year. No point in going any further!
But we observe that sometimes and in some places, the differences grow slowly, while at other times and places, they may grow much more rapidly. And by using different computer models, we can take into account our imperfect understanding of the physics of the atmosphere. By these means, we can state with some accuracy how confident we are in any particular forecast for any particular time and place. Today, one of the greatest challenges faced by weather forecasters is how best to convey their estimates of forecast confidence to the public.
Ideally, we would like to be able to say with full scientific backing something like "the odds of hurricane force winds in New York City sometime between Friday and Sunday are 20%." We have far to go to perfect these, but probabilistic statements like this are the best for which we can hope.
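The ensemble method behind such a statement is easy to demonstrate on the kind of chaotic system Lorenz studied. The sketch below is a toy illustration, not an operational forecast system: many copies of the Lorenz-63 equations are started from nearly identical states, the spread of the copies measures how much confidence the forecast deserves, and the fraction of members crossing a threshold plays the role of a probability like the 20% above.

```python
# Toy ensemble forecast with the Lorenz-63 system; illustrative only.
import random

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz-63 equations one step with simple Euler integration."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

def run(state, n_steps):
    for _ in range(n_steps):
        state = lorenz_step(state)
    return state

random.seed(0)
base = run((1.0, 1.0, 1.0), 1000)  # spin up onto the attractor

# Build an ensemble of slightly perturbed initial conditions ("butterflies").
ensemble = [tuple(v + random.gauss(0.0, 1e-3) for v in base) for _ in range(50)]

for lead in (200, 600, 1200):  # increasing forecast lead times, in model steps
    finals = [run(member, lead) for member in ensemble]
    xs = [s[0] for s in finals]
    mean = sum(xs) / len(xs)
    spread = (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5
    prob = sum(x > 5.0 for x in xs) / len(xs)  # fraction of members above a threshold
    print(f"lead {lead:>4} steps: ensemble spread {spread:7.3f}, P(x > 5) = {prob:.2f}")
```

At short lead times the members stay close together and the "forecast" is confident; at longer leads the spread grows until it is as large as the natural variability of the system, which is the practical predictability limit Lorenz identified.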
We know from experience that everyone will deal with such probabilistic forecasts in their own way: People have a very broad range of risk aversion. But the next time you are inclined to criticize weather forecasters for assigning probabilities to their forecasts, remember this essay and consider how much better off you are than with other types of forecasters you rely on. Your stockbroker, for example.
The opinions expressed in this commentary are solely those of Kerry Emanuel.
Nadya Anscombe
If wind power is going to meet 20% of our predicted energy needs in 2100, millions of wind turbines must be installed around the globe. Modelling performed by researchers at Massachusetts Institute of Technology, US, has shown that these vast wind farms, if installed in offshore regions, could reduce the temperature of the lower atmosphere above the site by 1 °C.
This is in contrast to earlier work that found that a land-based deployment of wind turbines large enough to meet one-fifth of predicted world energy needs in 2100 could lead to a significant temperature increase in the lower atmosphere over the installed regions.
Chien Wang and Ronald Prinn say their findings show how important it is that rigorous scientific assessments are made before deployment of large-scale wind farms. "We were surprised by our findings at first but we soon realized that the cooling we predicted is due principally to the enhanced latent heat flux from the sea surface to the lower atmosphere," Wang told environmentalresearchweb.
Wang and Prinn found that the effect varied depending on the location of the wind farm. In tropical and mid-latitude sites, the temperature of the lower atmosphere was reduced by up to 1 °C, whereas even greater reductions were seen in the high-latitude sites.
The consequences of such a temperature change are not clear but the researchers believe that it will have an effect on temperatures, clouds, precipitation and large-scale circulation beyond the installed regions. "However, these non-local impacts or teleconnections are much less significant than we saw in the land cases," said Wang. "This is likely due to the much lower response of the ocean to the imposed surface-drag changes relative to the response of the land to the imposed changes in both surface roughness and displacement height."
Wang and Prinn also examined the issues of intermittency, and hence reliability, of large-scale deployment of wind-driven electrical power generation by seasonally averaging the available wind power in various regions of the world. They found that intermittency would be a major issue for a power generation and distribution system that relies on the harvest of wind power from large-scale offshore wind farms.
"Intermittency presents a major challenge for power management, requiring solutions such as on-site energy storage, back-up generation and very long-distance power transmission for any electrical system dominated by offshore wind power," said Wang.
Wang is also keen to point out that the method he and Prinn used to simulate the offshore wind-turbine effect on the lower atmosphere involved simply increasing the ocean surface drag. "While this method is consistent with several detailed fine-scale simulations of wind turbines, it still needs further study to ensure its validity," he said.
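The shortcut Wang describes amounts to changing one coefficient in the standard bulk formula for surface stress, tau = rho * C_d * U^2. The coefficient values in the sketch below are order-of-magnitude placeholders, not the ones used in the ERL paper:

```python
# Sketch of the enhanced-drag shortcut; coefficient values are placeholders.
RHO_AIR = 1.2          # kg/m^3, near-surface air density
CD_OCEAN = 1.0e-3      # typical open-ocean drag coefficient (order of magnitude)
CD_WIND_FARM = 5.0e-3  # hypothetical enhanced value standing in for the turbines

def momentum_flux(wind_speed, drag_coefficient):
    """Bulk formula for surface stress: tau = rho * C_d * U^2  (N/m^2)."""
    return RHO_AIR * drag_coefficient * wind_speed ** 2

U = 10.0  # m/s wind at the lowest model level
print(f"surface stress without turbines : {momentum_flux(U, CD_OCEAN):.3f} N/m^2")
print(f"surface stress with enhanced drag: {momentum_flux(U, CD_WIND_FARM):.3f} N/m^2")
# The extra stress slows the near-surface wind over the deployment region; in the
# study, the cooling over the ocean was attributed principally to the enhanced
# latent heat flux from the sea surface that accompanies this stronger exchange.
```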
Wang and Prinn published their work in Environmental Research Letters (ERL).
The report is available here.
Dr. Tammy Thompson, a postdoctoral associate with the MIT Joint Program on the Science and Policy of Global Change, will speak at a press conference held by the Texas Sierra Club at Houston City Hall on Tuesday, July 19, at 10:00 a.m. Dr. Thompson will address air pollution in Texas.
For the Sierra Club Press Release, click here.
As the world economy attempts to balance burgeoning energy demand with lower carbon emissions, natural gas has become a central pillar of energy strategy in both the developed and developing world. Advances in exploration and production technology have led to newly abundant sources of “unconventional” natural gas in the United States, with implications for policymakers as they look to reduce dependence on imported oil, promote manufacturing and create jobs. Decisions by Germany and Japan to reduce their reliance on nuclear power also have major implications for the global gas market as governments and businesses look for alternative means of power generation. Sustained rapid economic expansion in emerging economies will continue to drive up demand for natural gas, while increased capacity for liquefied natural gas production in Australasia and the Middle East is already changing the market dynamics of a resource that has historically relied on the politics of pipelines and proximity.
On June 27, the Energy Security Initiative at Brookings hosted the Washington, D.C. launch of the Massachusetts Institute of Technology’s “The Future of Natural Gas,” an examination of the complex and rapidly changing prospects for the global natural gas market, and the role of natural gas in meeting global energy and environmental challenges. Ernest Moniz, director of the MIT Energy Initiative (MITEI) and co-chair of the report, and Melanie Kenderdine, executive director of MITEI, presented their findings on the extent of global natural gas reserves, production costs and potential end uses, as well as the geopolitical and environmental implications of a gas-fueled global economy. Following a presentation of the report’s findings, David Goldwyn, nonresident senior fellow with the Energy Security Initiative, moderated a panel discussion. Gerald Kepes, partner and head of Upstream and Gas at PFC Energy, and Phil Sharp, president of Resources for the Future, joined the discussion. After the program, the speakers took audience questions.
At MIT, former Michigan governor touts new bipartisan initiatives to make the U.S. the world’s clean-energy champ.
David L. Chandler, MIT News Office

In a spirited talk at MIT, former Michigan governor Jennifer Granholm presented a plan for a bipartisan initiative that she said could help the United States regain a world leadership role in the creation of new clean-energy technologies — and the thousands of new jobs that those technologies could provide.
Introduced by her “old pal,” Massachusetts Gov. Deval Patrick, and MIT President Susan Hockfield, Granholm spoke at Tuesday’s reception on clean-energy innovation. The event was hosted by the MIT Energy Initiative and the Joint Program on the Science and Policy of Global Change, a program that its co-director, TEPCO Professor of Atmospheric Science Ronald Prinn, described as a “unique collaboration between the natural and social sciences.”
“At MIT, we’re bullish on clean energy,” Hockfield said in her introduction. In fact, she said, “bullish is an understatement. We’re maniacs about it!” She added that she sees the clean-energy domain as a major area in which to rebuild the nation’s economy.
Patrick said his attendance was intended “to celebrate the leadership of MIT” in clean-energy technology. He said the Institute “has gone so far beyond the basic science … to commercialize so many great ideas” in clean energy, and that in today’s climate of volatile oil prices, “all the elements align for moving ourselves rapidly to a clean-energy future.” He added that in Massachusetts, there has been a 60 percent increase in energy-related employment “during the worst economy in living memory.”
Granholm, who now represents the Pew Charitable Trusts’ Clean Energy Program, said other countries have been “much more aggressive” than the United States in pushing for clean energy, while this country has “a patchwork” of state policies and no strong national program to promote such technologies. In searching for what Granholm called “pragmatic energy policies that can get bipartisan support” even in the current highly polarized political debate, her organization has identified four specific policy priorities, she said.
First, “a national renewable energy standard” would call for at least 20 percent of the nation’s energy to come from renewable sources by 2020, she said. Such a policy “sends a market signal” that would help businesses focus on developing needed technologies.
A second priority, she said, is encouraging more energy efficiency in industrial facilities. She pointed to the example of a French company called Veolia Energy, which develops combined heat and power systems that can be up to 90 percent efficient in using natural gas, the cleanest of all fossil fuels, compared to typical fossil-fuel powerplant efficiencies of around 50 percent. Granholm pointed out that so much energy is wasted in U.S. powerplants in the form of heat that “if you could just capture that waste heat, you could power the entire nation of Japan.”
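Put in concrete terms, using only the round efficiency figures quoted in the talk applied to an arbitrary fuel input (this is back-of-envelope arithmetic, not plant data):

```python
# Back-of-envelope comparison using the efficiency figures quoted in the talk.
FUEL_IN = 100.0  # arbitrary units of natural-gas energy

# Conventional plant at ~50% efficiency: half the fuel energy becomes electricity,
# the rest is rejected as waste heat.
conventional_useful = 0.50 * FUEL_IN
conventional_waste = FUEL_IN - conventional_useful

# Combined heat and power at ~90% total utilization: similar electricity output
# plus most of the remaining energy delivered as useful heat.
chp_useful = 0.90 * FUEL_IN
chp_waste = FUEL_IN - chp_useful

print(f"conventional: {conventional_useful:.0f} units useful, {conventional_waste:.0f} wasted as heat")
print(f"CHP         : {chp_useful:.0f} units useful, {chp_waste:.0f} wasted as heat")
```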
Third, she said, is to push for more electrification of the transportation system — including a 25 percent market share for new electric cars by 2020 — and improved efficiency for non-electric vehicles. That would help spur the growth of companies such as the MIT-spinoff A123 Systems, which is already “hiring hundreds of people” for its new battery factories.
And fourth, she said, is to “increase the amount of money we, as a nation, invest in energy development.” ARPA-E, the U.S. Department of Energy’s agency for investment in innovative energy technology, currently has a budget of $3.8 billion per year. “If we boost that to $16 billion, we could really be on the map” as a major producer of energy systems, she said.
Granholm pointed out that since 2004, there has been a 630 percent increase in private-sector investment in clean energy worldwide. In 2008, the United States was number one in production of clean-energy technology, but by 2009 China had surged ahead, and in 2010 both China and Germany were ahead of the United States. “Every day, businesses make decisions about where to locate,” and without a strong clean-energy policy, the country’s competitive position “will continue to ratchet down,” she said.
While some people worry that implementing any national policy on clean energy may be difficult right now given the polarized atmosphere in Washington, Granholm said, a recent national survey gives reason for hope. “Eighty-four percent of Americans want to see a national energy policy that encourages renewable energy and efficiency,” a number that includes 74 percent of Republicans, and even a majority of Tea Party members, she said.
Patrick said fostering clean-energy technologies “is good for us, it’s good for the environment, it’s good for the economy, it’s good for jobs. So let’s get on with it!”
Moderator: Ronald G. Prinn
MIT President Susan Hockfield
Governor Deval Patrick
Governor Jennifer Granholm