News + Media
Analysis shows that, contrary to some claims, proposed legislation to limit carbon emissions would not disadvantage those with lower incomes.
So-called “cap and trade” legislation has often been portrayed as a regressive policy — one that would hit poor people the hardest. A new MIT study concluded that this is not the case.
The U.S. House of Representatives passed a cap-and-trade bill last year, and different versions of that bill had been working their way through the Senate until being yanked from consideration last month.
The study, co-authored by researchers at the MIT Joint Program on the Science and Policy of Global Change and at Tufts University, found that under all three versions of the bill submitted so far, the costs would fall hardest on wealthier households, and that lower-income households would see no change or a net benefit. (more...)
Study finds significant potential to displace coal, reducing greenhouse gas emissions 
Natural gas will play a leading role in reducing greenhouse-gas emissions over the next several decades, largely by replacing older, inefficient coal plants with highly efficient combined-cycle gas generation. That’s the conclusion of a comprehensive study of the future of natural gas conducted by an MIT study group composed of 30 MIT faculty members, researchers, and graduate students. The findings, summarized in an 83-page report, were presented to lawmakers and senior administration officials this week in Washington.
The two-year study, managed by the MIT Energy Initiative (MITEI), examined the scale of U.S. natural gas reserves and the potential of this fuel to reduce greenhouse-gas emissions. Based on the work of the multidisciplinary team, with advice from a board of 16 leaders from industry, government and environmental groups, the report examines the future of natural gas through 2050 from the perspectives of technology, economics, politics, national security and the environment.
The report includes a set of specific proposals for legislative and regulatory policies, as well as recommendations for actions that the energy industry can pursue on its own, to maximize the fuel’s impact on mitigating greenhouse gas. The study also examined ways to control the environmental impacts that could result from a significant expansion in the production and use of natural gas — especially in electric power production.
[More...]
Tribute to MIT Joint Program Co-Founder and Co-Director Henry D. "Jake" Jacoby.
Speakers: Ronald Prinn, William Pounds, Mort Webster, Brian Flannery, Paul Eckbo, Arlie Sterling, Ian Sue Wing, Martin Zimmerman, and John Reilly
As U.N. negotiations begin this week on a global mercury treaty, an MIT atmospheric scientist explains the challenges ahead.
The first United Nations negotiating session for a global, legally binding mercury treaty begins today in Stockholm. Continuing through Friday, this is the first of five planned negotiating sessions that will address global controls on mercury, a toxin that causes neurological damage and impairs brain development in infants and children around the world. The sessions are expected to result in a global treaty to be signed in late 2013 that will address the emissions and use of mercury in products, wastes and international trade. Noelle Selin, an assistant professor of engineering systems in MIT's Engineering Systems Division, with a joint appointment in atmospheric chemistry in the Department of Earth, Atmospheric and Planetary Sciences, studies the interactions between science and policy in international environmental negotiations. She sat down with MIT News to discuss the first negotiating session and what she considers the biggest hurdles to a global treaty, which she says is "not a given" for the U.S.
Q. What do you see as the biggest challenge in the effort to reduce mercury emissions worldwide? 
A. I see two major intersecting challenges: addressing the global spread of mercury emissions from coal-fired power plants in the context of the increasing demand for energy, and dealing with local impacts of mercury contamination.
The single largest source of anthropogenic mercury emissions is power generation, particularly from coal-fired power plants. A growing, worldwide demand for energy is increasing the use of coal, and this trend will lead to more mercury emissions if not controlled. About half of current anthropogenic emissions come from Asia, mostly from China, which is dramatically increasing its use of coal. Much of the coal used in China is also relatively high in mercury content. Recent research shows that future mercury emissions to the atmosphere depend most significantly on how energy-based industrial development proceeds in Asia.
Dealing simultaneously with both local issues and long-range transport of mercury will also be a critical challenge for an international agreement. Mercury emitted in elemental form travels worldwide. At the same time, some other forms of emitted mercury deposit close to emission sources. Local impact also comes from the use of mercury in processes and products. Mercury is used extensively in artisanal gold mining in developing countries. Workers and local communities are exposed to some of the highest levels of mercury contamination in the world. Mercury also continues to be used in products, such as thermometers, thermostats, fluorescent light bulbs and a wide range of electronic equipment, including computer monitors and cell phones. Disposal of these products, particularly electronic waste (e-waste) in developing countries, can expose local populations to mercury.
Q. Even if an international treaty is passed, how will it be implemented or enforced? 
A. In general, implementation and enforcement of international environmental agreements are difficult. Some countries simply do not have the intention or political will to meet their obligations. Furthermore, many developing countries lack the financial resources and technical capacity to effectively implement international environmental regulations. For this reason, some environmental agreements include mechanisms for capacity building, as well as the provision of financial assistance. However, this is often one of the most contentious topics of negotiation: the resources available for implementation are often limited, and many developing countries argue that industrialized countries do not provide enough support for capacity building.
Another implementation challenge will be coordinating an international mercury treaty with other environmental agreements that already partly cover mercury and other hazardous substances. The Basel Convention on the Control of Transboundary Movements of Hazardous Wastes and Their Disposal controls the international trade and management of hazardous waste including waste containing mercury. The Rotterdam Convention on the Prior Informed Consent Procedure for Certain Hazardous Chemicals and Pesticides in International Trade sets out provisions for the import and export of hazardous chemicals, including mercury. Coordination with these two agreements will be important in addressing the entire life cycle of mercury, including mining and production, use, emission, disposal and cleanup.
Q. How would an international treaty affect developed countries like the U.S. that already regulate mercury emissions? How do current laws in the U.S. regarding mercury emissions and use compare to other industrialized nations? 
A. The U.S. regulates mercury emissions from municipal-waste combustion and medical-waste incineration, but does not currently regulate mercury emissions from coal-fired power plants, which are the largest domestic mercury emission source. This is an area where U.S. regulations should be strengthened; the EPA is currently developing power-plant emissions standards for mercury.
European countries also have stronger regulations than the U.S. on mercury in many products, including a large number of common electronic goods. Sweden, for example, has banned mercury in almost all products, but there are some exceptions, including the use of mercury in compact fluorescent light bulbs. In the U.S., many efforts to phase out mercury in products are voluntary, although some states have more stringent regulations. In fact, California has largely copied European Union regulation on mercury and other hazardous substances in electronics, going beyond federal requirements.
For the U.S., any treaty ratification requires the advice and consent of the Senate, and must be approved by two-thirds of all senators. Over the past few decades, this has been an obstacle for U.S. participation in many multilateral environmental agreements. As a result, the U.S. has not ratified several important environmental treaties, including the Basel and Rotterdam conventions. Domestic politics is likely to be a continuing challenge for U.S. implementation of environmental regulations and international cooperation on mercury, and it is not a given that the U.S. would become a party to a mercury treaty.
[More... ]
If we double the Earth’s greenhouse gases, how much will the temperature change? That’s what this number tells you. (This is the second part of an "Explained" on climate change. Part one dealt with radiative forcing.)
Climate sensitivity is the term used by the Intergovernmental Panel on Climate Change (IPCC) to express the relationship between the human-caused emissions that add to the Earth's greenhouse effect — carbon dioxide and a variety of other greenhouse gases — and the temperature changes that will result from these emissions.
Specifically, the term is defined as how much the average global surface temperature will increase if there is a doubling of greenhouse gases (expressed as carbon dioxide equivalents) in the air, once the planet has had a chance to settle into a new equilibrium after the increase occurs. In other words, it’s a direct measure of how the Earth’s climate will respond to that doubling.
That value, according to the most recent IPCC report, is 3 degrees Celsius, with a range of uncertainty from 2 to 4.5 degrees. This sensitivity depends primarily on all the different feedback effects, both positive and negative, that either amplify or diminish the greenhouse effect. There are three primary feedback effects — clouds, sea ice and water vapor; these, combined with other feedback effects, produce the greatest uncertainties in predicting the planet’s future climate.
With no feedback effects at all, the change would be just 1 degree Celsius, climate scientists agree. Virtually all of the controversies over climate science hinge on just how strong the various feedbacks may be — and on whether scientists may have failed to account for some of them.
Clouds are a good example. Clouds can have either a positive or negative feedback effect, depending on their altitude and the size of their water droplets. Overall, most scientists expect this net effect to be positive, but there are large uncertainties.
"There is still lots of uncertainty in what the climate sensitivity is," says Andrei Sokolov, a research scientist in MIT's Center for Global Change Science, who has been doing research on climate sensitivity for many years. "Feedback is what's driving things," he says.
It is important to note that climate sensitivity is figured on the basis of an overall doubling, compared to pre-industrial levels, of carbon dioxide and other greenhouse gases. But the temperature change given by this definition of climate sensitivity is only part of the story. The actual increase might be greater in the long run, because without strong policies to control emissions, greenhouse gas levels in the atmosphere could more than double. And in the short run, the actual warming could be less than the climate sensitivity suggests: because of the thermal inertia of the ocean, it may take some time after the doubling is reached before the climate settles into a new equilibrium.
[More... ]
A curriculum built around a rotating-tank experiment could improve weather and climate education
In recent years, U.S. undergraduates have shown an increasing interest in introductory meteorology, oceanography and climate classes. But many students find it difficult to grasp the non-intuitive nature of rotating fluids, which is critical to understanding how weather systems and climate work. Part of the problem, it turns out, is that instructors usually have to teach these abstract concepts using only equations or computer simulations because of the limited resources available for lab experiments.
That may be about to change, thanks to the work of two educators from the Department of Earth, Atmospheric and Planetary Sciences. For nearly a decade, Lodovica Illari, an EAPS senior lecturer, and John Marshall, professor of atmospheric and oceanic sciences, have been developing an undergraduate weather and climate curriculum that's now being adopted by dozens of schools — and could have a wide impact on science education at many levels.
Known as "Weather in a Tank," the experiment-based curriculum was designed by Illari and Marshall in 2001 after they began offering an introductory weather and climate class that would also fulfill their students' lab requirements.
Since 2006, the curriculum has been tested in a project funded by the National Science Foundation (NSF), which involves MIT and five other universities. The intent was to bridge the gap between real-world weather phenomena and the theories and equations that describe those phenomena. Illari says that we should think of lab experiments as the third leg of a three-legged pedagogical stool that includes observation and theory.
[More... ]
MIT analysis suggests generating electricity from large-scale wind farms could influence climate — and not necessarily in the desired way
Wind power has emerged as a viable renewable energy source in recent years — one that proponents say could lessen the threat of global warming. Although the American Wind Energy Association estimates that only about 2 percent of U.S. electricity is currently generated from wind turbines, the U.S. Department of Energy has said that wind power could account for a fifth of the nation’s electricity supply by 2030.
But a new MIT analysis may serve to temper enthusiasm about wind power, at least at very large scales. Ron Prinn, TEPCO Professor of Atmospheric Science, and principal research scientist Chien Wang of the Department of Earth, Atmospheric and Planetary Sciences, used a climate model to analyze the effects of millions of wind turbines that would need to be installed across vast stretches of land and ocean to generate wind power on a global scale. Such a massive deployment could indeed impact the climate, they found, though not necessarily with the desired outcome.
In a paper published online Feb. 22 in Atmospheric Chemistry and Physics, Wang and Prinn suggest that using wind turbines to meet 10 percent of global energy demand in 2100 could cause temperatures to rise by one degree Celsius in the regions on land where the wind farms are installed, with a smaller increase in areas beyond those regions. Their analysis indicates the opposite result for wind turbines installed in water: a drop in temperatures of one degree Celsius over those regions. The researchers also suggest that the intermittency of wind power could require significant and costly backup options, such as natural gas-fired power plants.
Prinn cautioned against interpreting the study as an argument against wind power, urging that it be used to guide future research that explores the downsides of large-scale wind power before significant resources are invested to build vast wind farms. "We're not pessimistic about wind," he said. "We haven't absolutely proven this effect, and we'd rather see that people do further research."
Daniel Kirk-Davidoff, chief scientist for MDA Federal Inc., which develops remote-sensing technologies, and adjunct professor of meteorology at the University of Maryland, has examined the climate impacts of large-scale wind farms in previous studies. To him, the most promising result of the MIT analysis is its indication that large-scale installation of wind turbines doesn't slow wind flow so much that it would be impossible to generate a desirable amount of energy. "When you put the wind turbines in, they are generating the kind of power you'd hope for," he said.

Tapping the wind resource

Previous studies have predicted that annual world energy demand will increase from 14 terawatts (trillion watts) in 2002 to 44 terawatts by 2100. In their analysis, Prinn and Wang focus on the impact of using wind turbines to generate five terawatts of electric power.
[More... ]
When there's more energy radiating down on the planet than there is radiating back out to space, something’s going to have to heat up. (This is the first of a two-part "Explained" on the scientific concepts underlying the greenhouse effect and global climate change. Part two deals with climate sensitivity.)
When people talk about global warming or the greenhouse effect, the main underlying scientific concept that describes the process is radiative forcing. And despite all the recent controversy over leaked emails and charges of poorly sourced references in the last Intergovernmental Panel on Climate Change report, the basic concept of radiative forcing is one on which scientists — whatever their views on global warming or the IPCC — all seem to agree. Disagreements come into play in determining the actual value of that number.
The concept of radiative forcing is fairly straightforward. Energy is constantly flowing into the atmosphere in the form of sunlight that always shines on half of the Earth's surface. Some of this sunlight (about 30 percent) is reflected back to space and the rest is absorbed by the planet. And like any warm object sitting in cold surroundings — and space is a very cold place — some energy is always radiating back out into space as invisible infrared light. Subtract the energy flowing out from the energy flowing in, and if the number is anything other than zero, there has to be some warming (or cooling, if the number is negative) going on.
It's as if you have a kettle full of water, which is at room temperature. That means everything is at equilibrium, and nothing will change except as small random variations. But light a fire under that kettle, and suddenly there will be more energy flowing into that water than radiating out, and the water is going to start getting hotter.
In short, radiative forcing is a direct measure of the amount that the Earth’s energy budget is out of balance.
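The in-minus-out bookkeeping described above can be written down directly. This is an illustrative sketch, not the IPCC's analysis: it uses standard round values for the solar constant and planetary albedo, and a bare Stefan-Boltzmann blackbody for the outgoing infrared, which deliberately ignores the atmosphere's greenhouse effect.

```python
# Planetary energy budget: absorbed sunlight minus emitted infrared.
SOLAR_CONSTANT = 1361.0  # incoming sunlight at the top of the atmosphere, W/m^2
ALBEDO = 0.30            # fraction of sunlight reflected back to space
SIGMA = 5.670e-8         # Stefan-Boltzmann constant, W/m^2/K^4

def radiative_imbalance(temp_k):
    """Energy in minus energy out, W/m^2; positive means net warming."""
    absorbed = SOLAR_CONSTANT / 4.0 * (1.0 - ALBEDO)  # averaged over the sphere
    emitted = SIGMA * temp_k ** 4                     # blackbody infrared
    return absorbed - emitted

def equilibrium_temp():
    """Temperature at which the budget balances (imbalance = 0)."""
    absorbed = SOLAR_CONSTANT / 4.0 * (1.0 - ALBEDO)
    return (absorbed / SIGMA) ** 0.25

print(round(equilibrium_temp(), 1))  # ~254.6 K with no greenhouse effect
```

A positive imbalance means more energy flowing in than out, i.e. warming, just as in the kettle analogy; the fact that the no-greenhouse balance point (about -18 degrees Celsius) sits far below Earth's actual surface temperature is itself a measure of how much the greenhouse effect matters.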
While the concept is simple, the analysis required to figure out the actual value of this number for the Earth right now is much more complicated and difficult. Many different factors have an effect on this balancing act, and each has its own level of uncertainty and its own difficulties in being precisely measured. And the individual contributions to radiative forcing cannot simply be added together to get the total, because some of the factors overlap — for example, some different greenhouse gases absorb and emit at the same infrared wavelengths of radiation, so their combined warming effect is less than the sum of their individual effects.
In its most recent report in 2007, the IPCC produced the most comprehensive estimate to date of the overall radiative forcing affecting the Earth today. Ronald Prinn, the TEPCO Professor of Atmospheric Science and director of MIT's Center for Global Change Science, was one of the lead authors of that chapter of the IPCC's Fourth Assessment Report. Radiative forcing "was very small in the past, when global average temperatures were not rising or falling substantially," he explains.
[More... ]
Intense hurricane activity millions of years ago may have caused and sustained warmer climate conditions, new research suggests
A question central to research on global warming is how warmer temperatures caused by increased greenhouse gases could influence climate. Probing the past for clues about this potential effect, MIT and Yale climate scientists examined the Pliocene period, which began five million years ago and which some consider to be a potential analog to modern greenhouse conditions. They found that hurricanes influenced by weakened atmospheric circulation — possibly related to high levels of carbon dioxide — contributed to very warm temperatures in the Pacific Ocean, which in turn led to more frequent and intense hurricanes. The research indicates that Earth's climate may have multiple states based on this feedback cycle, meaning that the climate could change qualitatively in response to the effects of global warming.
Although scientists know that the early Pliocene had carbon dioxide concentrations similar to those of today, it has remained a mystery what caused the high levels of greenhouse gas and how the Pliocene's warm conditions, including an extensive warm pool in the Pacific Ocean and temperatures that were roughly 4 degrees C higher than today's, were maintained.
In a paper published Feb. 25 in Nature, Kerry Emanuel, the Breene M. Kerr Professor of Atmospheric Science in the Department of Earth, Atmospheric and Planetary Sciences, and two colleagues from Yale University's Department of Geology and Geophysics suggest that a positive feedback between tropical cyclones — commonly called hurricanes and typhoons — and the circulation in the Pacific could have been the mechanism that enabled the Pliocene's warm climate.
The Pliocene ended around three million years ago with the onset of large ice sheets in the Northern Hemisphere. Atmospheric carbon dioxide levels had been slowly declining for about 15 million years, and it is thought that the glacial cycles began as the climate's response once those levels reached a certain threshold, according to co-author Chris Brierley. While that threshold remains unknown, this research indicates that by increasing carbon dioxide levels, humans could cross it again and induce a Pliocene-like climate.
[More...]
Moderator: Ernest J. Moniz
Robert N. Stavins
Michael Greenstone
Stephen Ansolabehere
Edward S. Steinfeld
Henry D. Jacoby
John Sterman PhD '82