News + Media

3 Questions
MIT News

As U.N. negotiations begin this week on a global mercury treaty, an MIT atmospheric scientist explains the challenges ahead.

The first United Nations negotiating session for a global, legally binding mercury treaty begins today in Stockholm. Continuing through Friday, this is the first of five planned negotiating sessions that will address global controls on mercury, a toxin that causes neurological damage and impairs brain development in infants and children around the world. The sessions are expected to result in a global treaty to be signed in late 2013 that will address the emissions and use of mercury in products, wastes and international trade. Noelle Selin, an assistant professor of engineering systems in MIT's Engineering Systems Division, with a joint appointment in atmospheric chemistry in the Department of Earth, Atmospheric and Planetary Sciences, studies the interactions between science and policy in international environmental negotiations. She sat down with MIT News to discuss the first negotiating session, and what she considers to be the biggest hurdles to signing a global treaty, which is "not a given" for the U.S.

Q. What do you see as the biggest challenge in the effort to reduce mercury emissions worldwide?

A. I see two major intersecting challenges: addressing the global spread of mercury emissions from coal-fired power plants in the context of the increasing demand for energy, and dealing with local impacts of mercury contamination.

The single largest source of anthropogenic mercury emissions is power generation, particularly from coal-fired power plants. A growing, worldwide demand for energy is increasing the use of coal, and this trend will lead to more mercury emissions if not controlled. About half of current anthropogenic emissions come from Asia, mostly from China, which is dramatically increasing its use of coal. Much of the coal used in China is also relatively high in mercury content. Recent research shows that future emissions of mercury to the atmosphere depend most significantly on how energy-based industrial development proceeds in Asia.

Dealing simultaneously with both local issues and long-range transport of mercury will also be a critical challenge for an international agreement. Mercury emitted in elemental form travels worldwide. At the same time, some other forms of emitted mercury deposit close to emission sources. Local impact also comes from the use of mercury in processes and products. Mercury is used extensively in artisanal gold mining in developing countries. Workers and local communities are exposed to some of the highest levels of mercury contamination in the world. Mercury also continues to be used in products, such as thermometers, thermostats, fluorescent light bulbs and a wide range of electronic equipment, including computer monitors and cell phones. Disposal of these products, particularly electronic waste (e-waste) in developing countries, can expose local populations to mercury.

Q. Even if an international treaty is passed, how will it be implemented or enforced?

A. In general, implementation and enforcement of international environmental agreements are difficult. Some countries simply do not have the intention or political will to meet their obligations. Furthermore, many developing countries lack the financial resources and technical capacity to effectively implement international environmental regulations. For this reason, some environmental agreements include mechanisms for capacity building, as well as the provision of financial assistance. However, this is often one of the most contentious topics of negotiation, and the resources available for implementation are often limited; many developing countries argue that industrialized countries do not provide enough support for capacity building.

Another implementation challenge will be coordinating an international mercury treaty with other environmental agreements that already partly cover mercury and other hazardous substances. The Basel Convention on the Control of Transboundary Movements of Hazardous Wastes and Their Disposal controls the international trade and management of hazardous waste including waste containing mercury. The Rotterdam Convention on the Prior Informed Consent Procedure for Certain Hazardous Chemicals and Pesticides in International Trade sets out provisions for the import and export of hazardous chemicals, including mercury. Coordination with these two agreements will be important in addressing the entire life cycle of mercury, including mining and production, use, emission, disposal and cleanup.

Q. How would an international treaty affect developed countries like the U.S. that already regulate mercury emissions? How do current laws in the U.S. regarding mercury emissions and use compare to other industrialized nations?

A. The U.S. regulates mercury emissions from municipal-waste combustion and medical-waste incineration, but does not currently regulate mercury emissions from coal-fired power plants, which are the largest domestic mercury emission source. This is an area where U.S. regulations should be strengthened; the EPA is currently developing power-plant emissions standards for mercury.

European countries also have stronger regulations than the U.S. on mercury in many products, including a large number of common electronic goods. Sweden, for example, has banned mercury in almost all products, but there are some exceptions, including the use of mercury in compact fluorescent light bulbs. In the U.S., many efforts to phase out mercury in products are voluntary, although some states have more stringent regulations. In fact, California has largely copied European Union regulation on mercury and other hazardous substances in electronics, going beyond federal requirements.

For the U.S., any treaty ratification requires the advice and consent of the Senate, and must be approved by two-thirds of all senators. Over the past few decades, this has been an obstacle for U.S. participation in many multilateral environmental agreements. As a result, the U.S. has not ratified several important environmental treaties, including the Basel and Rotterdam conventions. Domestic politics is likely to be a continuing challenge for U.S. implementation of environmental regulations and international cooperation on mercury, and it is not a given that the U.S. would become a party to a mercury treaty.

[More... ]

MIT News

If we double the Earth’s greenhouse gases, how much will the temperature change? That’s what this number tells you. (This is the second part of an "Explained" on climate change. Part one dealt with radiative forcing.)

Climate sensitivity is the term used by the Intergovernmental Panel on Climate Change (IPCC) to express the relationship between the human-caused emissions that add to the Earth's greenhouse effect — carbon dioxide and a variety of other greenhouse gases — and the temperature changes that will result from these emissions.

Specifically, the term is defined as how much the average global surface temperature will increase if there is a doubling of greenhouse gases (expressed as carbon dioxide equivalents) in the air, once the planet has had a chance to settle into a new equilibrium after the increase occurs. In other words, it’s a direct measure of how the Earth’s climate will respond to that doubling.

That value, according to the most recent IPCC report, is 3 degrees Celsius, with a range of uncertainty from 2 to 4.5 degrees. This sensitivity depends primarily on all the different feedback effects, both positive and negative, that either amplify or diminish the greenhouse effect. There are three primary feedback effects — clouds, sea ice and water vapor; these, combined with other feedback effects, produce the greatest uncertainties in predicting the planet’s future climate.

With no feedback effects at all, the change would be just 1 degree Celsius, climate scientists agree. Virtually all of the controversies over climate science hinge on just how strong the various feedbacks may be — and on whether scientists may have failed to account for some of them.
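The arithmetic linking the no-feedback response to the 2 to 4.5 degree range can be sketched with the standard zero-dimensional feedback relation. The forcing, Planck response and feedback factors below are illustrative textbook values, not numbers taken from the IPCC report:

```python
# Sketch: how feedbacks amplify the no-feedback warming from a CO2 doubling.
# Illustrative values (assumed, not from the IPCC assessment):
#   DELTA_F_2XCO2 - radiative forcing from doubled CO2, ~3.7 W/m^2
#   PLANCK_LAMBDA - blackbody-only restoring strength, ~3.2 W/m^2 per K

DELTA_F_2XCO2 = 3.7   # W/m^2
PLANCK_LAMBDA = 3.2   # W/m^2/K

def sensitivity(feedback_factor: float) -> float:
    """Equilibrium warming (K) for a CO2 doubling, given a net
    feedback factor f in [0, 1): dT = dT0 / (1 - f)."""
    dT0 = DELTA_F_2XCO2 / PLANCK_LAMBDA   # ~1.2 K with no feedbacks
    return dT0 / (1.0 - feedback_factor)

for f in (0.0, 0.4, 0.6, 0.75):
    print(f"f = {f:.2f} -> {sensitivity(f):.1f} K")
```

With f = 0 the sketch reproduces the roughly 1 degree no-feedback response, while feedback factors between about 0.4 and 0.75 span the IPCC's 2 to 4.5 degree range, which is why pinning down the strength of the feedbacks matters so much.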

Clouds are a good example. Clouds can have either a positive or negative feedback effect, depending on their altitude and the size of their water droplets. Overall, most scientists expect this net effect to be positive, but there are large uncertainties.

"There is still lots of uncertainty in what the climate sensitivity is," says Andrei Sokolov, a research scientist in MIT's Center for Global Change Science, who has been doing research on climate sensitivity for many years. "Feedback is what's driving things," he says.

It is important to note that climate sensitivity is figured on the basis of an overall doubling, compared to pre-industrial levels, of carbon dioxide and other greenhouse gases. But the temperature change given by this definition of climate sensitivity is only part of the story. The actual increase might be greater in the long run because greenhouse gas levels in the atmosphere could more than double without strong policies to control emissions. But in the short run, the actual warming could be less than the climate sensitivity suggests: because of the thermal inertia of the ocean, it may take some time after a doubling of the concentration is reached before the climate settles into its new equilibrium.
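The effect of ocean thermal inertia can be illustrated with a minimal energy-balance model, in which an effective heat capacity C delays the approach to the equilibrium warming F/lambda. All parameter values here are assumed round numbers for illustration, not output from a climate model:

```python
# Sketch: why ocean thermal inertia delays equilibrium after a CO2 doubling.
# Zero-dimensional energy balance, C * dT/dt = F - LAM * T, with assumed values.

SECONDS_PER_YEAR = 3.15e7
F = 3.7      # W/m^2, forcing from doubled CO2 (held fixed)
LAM = 1.2    # W/m^2/K, so equilibrium warming F/LAM is ~3.1 K
C = 4.2e8    # J/m^2/K, effective heat capacity (~100 m ocean mixed layer)

def warming_after(years: int, dt_years: float = 0.1) -> float:
    """Euler-step the energy balance and return the warming (K) after `years`."""
    T = 0.0
    for _ in range(int(years / dt_years)):
        dTdt = (F - LAM * T) / C                  # K per second
        T += dTdt * dt_years * SECONDS_PER_YEAR
    return T

print(f"equilibrium: {F / LAM:.1f} K")
print(f"after 10 yr: {warming_after(10):.1f} K")
print(f"after 50 yr: {warming_after(50):.1f} K")
```

With these numbers the adjustment timescale C/LAM is roughly a decade, so a sizable fraction of the equilibrium warming is still "in the pipeline" years after the concentration doubles.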

[More... ]

MIT News

A curriculum built around a rotating-tank experiment could improve weather and climate education

In recent years, U.S. undergraduates have shown an increasing interest in introductory meteorology, oceanography and climate classes. But many students find it difficult to grasp the non-intuitive nature of rotating fluids, which is critical to understanding how weather systems and climate work. Part of the problem, it turns out, is that instructors usually have to teach these abstract concepts using only equations or computer simulations because of the limited resources available for lab experiments.

That may be about to change, thanks to the work of two educators from the Department of Earth, Atmospheric and Planetary Sciences. For nearly a decade, Lodovica Illari, an EAPS senior lecturer, and John Marshall, professor of atmospheric and oceanic sciences, have been developing an undergraduate weather and climate curriculum that's now being adopted by dozens of schools — and could have a wide impact on science education at many levels.

Known as "Weather in a Tank," the experiment-based curriculum was designed by Illari and Marshall in 2001 after they began offering an introductory weather and climate class that would also fulfill their students' lab requirements.

Since 2006, the curriculum has been tested in a project funded by the National Science Foundation (NSF), which involves MIT and five other universities. The intent was to bridge the gap between real-world weather phenomena and the theories and equations that describe those phenomena. Illari says that we should think of lab experiments as the third leg of a three-legged pedagogical stool that includes observation and theory.

[More... ]

MIT News

MIT analysis suggests generating electricity from large-scale wind farms could influence climate — and not necessarily in the desired way

Wind power has emerged as a viable renewable energy source in recent years — one that proponents say could lessen the threat of global warming. Although the American Wind Energy Association estimates that only about 2 percent of U.S. electricity is currently generated from wind turbines, the U.S. Department of Energy has said that wind power could account for a fifth of the nation’s electricity supply by 2030.

But a new MIT analysis may serve to temper enthusiasm about wind power, at least at very large scales. Ron Prinn, TEPCO Professor of Atmospheric Science, and principal research scientist Chien Wang of the Department of Earth, Atmospheric and Planetary Sciences, used a climate model to analyze the effects of millions of wind turbines that would need to be installed across vast stretches of land and ocean to generate wind power on a global scale. Such a massive deployment could indeed impact the climate, they found, though not necessarily with the desired outcome.

In a paper published online Feb. 22 in Atmospheric Chemistry and Physics, Wang and Prinn suggest that using wind turbines to meet 10 percent of global energy demand in 2100 could cause temperatures to rise by one degree Celsius in the regions on land where the wind farms are installed, with a smaller increase in areas beyond those regions. Their analysis indicates the opposite result for wind turbines installed in water: a drop in temperatures by one degree Celsius over those regions. The researchers also suggest that the intermittency of wind power could require significant and costly backup options, such as natural gas-fired power plants.

Prinn cautioned against interpreting the study as an argument against wind power, urging that it be used to guide future research that explores the downsides of large-scale wind power before significant resources are invested to build vast wind farms. "We're not pessimistic about wind," he said. "We haven't absolutely proven this effect, and we'd rather see that people do further research."

Daniel Kirk-Davidoff, a chief scientist for MDA Federal Inc., which develops remote sensing technologies, and adjunct professor of meteorology at the University of Maryland, has examined the climate impacts of large-scale wind farms in previous studies. To him, the most promising result of the MIT analysis is that it indicates that the large-scale installation of wind turbines doesn't appear to slow wind flow so much that it would be impossible to generate a desirable amount of energy. "When you put the wind turbines in, they are generating the kind of power you'd hope for," he said.

Tapping the wind resource

Previous studies have predicted that annual world energy demand will increase from 14 terawatts (trillion watts) in 2002 to 44 terawatts by 2100. In their analysis, Prinn and Wang focus on the impact of using wind turbines to generate five terawatts of electric power.
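A back-of-envelope check (using an assumed, hypothetical turbine rating and capacity factor, not figures from the Wang and Prinn paper) shows why five terawatts of average wind power implies turbines numbering in the millions:

```python
# Rough arithmetic: turbine count for 5 TW of average wind power.
# Assumed, illustrative figures: a 2 MW rated turbine at a 25% capacity factor.

TARGET_W = 5e12          # 5 TW of average electric power
RATED_W = 2e6            # 2 MW per turbine (assumed)
CAPACITY_FACTOR = 0.25   # average output / rated output (assumed)

avg_per_turbine = RATED_W * CAPACITY_FACTOR   # 0.5 MW average per turbine
n_turbines = TARGET_W / avg_per_turbine

print(f"~{n_turbines / 1e6:.0f} million turbines")
share_2100 = TARGET_W / 44e12   # fraction of the projected 44 TW demand
print(f"{share_2100:.0%} of projected 2100 demand")
```

Under these assumptions the target works out to roughly ten million turbines, and five of a projected 44 terawatts is the "about 10 percent of global energy demand" cited in the study.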

[More... ]

MIT News

When there's more energy radiating down on the planet than there is radiating back out to space, something’s going to have to heat up. (This is the first of a two-part "Explained" on the scientific concepts underlying the concept of the greenhouse effect and global climate change. Part two deals with climate sensitivity.)

When people talk about global warming or the greenhouse effect, the main underlying scientific concept that describes the process is radiative forcing. And despite all the recent controversy over leaked emails and charges of poorly sourced references in the last Intergovernmental Panel on Climate Change report, the basic concept of radiative forcing is one on which scientists — whatever their views on global warming or the IPCC — all seem to agree. Disagreements come into play in determining the actual value of that number.

The concept of radiative forcing is fairly straightforward. Energy is constantly flowing into the atmosphere in the form of sunlight that always shines on half of the Earth's surface. Some of this sunlight (about 30 percent) is reflected back to space and the rest is absorbed by the planet. And like any warm object sitting in cold surroundings — and space is a very cold place — some energy is always radiating back out into space as invisible infrared light. Subtract the energy flowing out from the energy flowing in, and if the number is anything other than zero, there has to be some warming (or cooling, if the number is negative) going on.

It's as if you have a kettle full of water, which is at room temperature. That means everything is at equilibrium, and nothing will change except as small random variations. But light a fire under that kettle, and suddenly there will be more energy flowing into that water than radiating out, and the water is going to start getting hotter.

In short, radiative forcing is a direct measure of the amount that the Earth’s energy budget is out of balance.
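The bookkeeping described above can be written out directly: average the absorbed sunlight over the sphere, subtract the Stefan-Boltzmann emission, and whatever is left over is a forcing. A minimal sketch with round illustrative numbers:

```python
# Sketch of the Earth's energy budget: absorbed sunlight in, thermal
# infrared out. Round illustrative numbers, not measurements.
SIGMA = 5.67e-8    # W/m^2/K^4, Stefan-Boltzmann constant
S0 = 1361.0        # W/m^2, solar constant at Earth's orbit
ALBEDO = 0.30      # ~30% of sunlight reflected straight back to space

# Sunlight intercepted by the disk, spread over the whole sphere (the /4):
absorbed = (S0 / 4.0) * (1.0 - ALBEDO)   # ~238 W/m^2

def forcing(T_effective: float) -> float:
    """Net flux in W/m^2: absorbed solar minus emitted infrared."""
    return absorbed - SIGMA * T_effective ** 4

# Temperature at which the budget balances exactly (forcing = 0):
T_eq = (absorbed / SIGMA) ** 0.25
print(f"balance at ~{T_eq:.0f} K")
print(f"{forcing(T_eq - 1.0):+.2f} W/m^2")   # positive: energy piling up, warming
print(f"{forcing(T_eq + 1.0):+.2f} W/m^2")   # negative: net loss, cooling
```

The balance temperature near 255 K is the planet's effective emission temperature as seen from space; the surface is warmer than that because greenhouse gases absorb part of the outgoing infrared.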

While the concept is simple, the analysis required to figure out the actual value of this number for the Earth right now is much more complicated and difficult. Many different factors have an effect on this balancing act, and each has its own level of uncertainty and its own difficulties in being precisely measured. And the individual contributions to radiative forcing cannot simply be added together to get the total, because some of the factors overlap — for example, some different greenhouse gases absorb and emit at the same infrared wavelengths of radiation, so their combined warming effect is less than the sum of their individual effects.

In its most recent report in 2007, the IPCC produced the most comprehensive estimate to date of the overall radiative forcing affecting the Earth today. Ronald Prinn, the TEPCO Professor of Atmospheric Science and director of MIT's Center for Global Change Science, was one of the lead authors of that chapter of the IPCC's Fourth Assessment Report. Radiative forcing "was very small in the past, when global average temperatures were not rising or falling substantially," he explains.

[More... ]

MIT News

Intense hurricane activity millions of years ago may have caused and sustained warmer climate conditions, new research suggests

A question central to research on global warming is how warmer temperatures caused by increased greenhouse gases could influence climate. Probing the past for clues about this potential effect, MIT and Yale climate scientists examined the Pliocene period, which began five million years ago and which some consider to be a potential analog to modern greenhouse conditions. They found that hurricanes influenced by weakened atmospheric circulation — possibly related to high levels of carbon dioxide — contributed to very warm temperatures in the Pacific Ocean, which in turn led to more frequent and intense hurricanes. The research indicates that Earth's climate may have multiple states based on this feedback cycle, meaning that the climate could change qualitatively in response to the effects of global warming.

Although scientists know that the early Pliocene had carbon dioxide concentrations similar to those of today, it has remained a mystery what caused the high levels of greenhouse gas and how the Pliocene's warm conditions, including an extensive warm pool in the Pacific Ocean and temperatures that were roughly 4 degrees C higher than today's, were maintained.

In a paper published Feb. 25 in Nature, Kerry Emanuel, the Breene M. Kerr Professor of Atmospheric Science in the Department of Earth, Atmospheric and Planetary Science, and two colleagues from Yale University's Department of Geology and Geophysics suggest that a positive feedback between tropical cyclones — commonly called hurricanes and typhoons — and the circulation in the Pacific could have been the mechanism that enabled the Pliocene's warm climate.

The Pliocene ended around three million years ago with the onset of large ice sheets in the Northern Hemisphere. There has been a slow reduction in carbon dioxide levels in the atmosphere for about 15 million years, and it is thought that the start of the glacial cycles was the climate's response once those levels reached a certain threshold, according to co-author Chris Brierley. While that level remains unknown, this research indicates that by increasing carbon dioxide levels, humans could reach the threshold that would induce a Pliocene-like climate.

More...

Video

Moderator: Ernest J. Moniz
Robert N. Stavins
Michael Greenstone
Stephen Ansolabehere
Edward S. Steinfeld
Henry D. Jacoby
John Sterman PhD '82

MIT News

VIDEO of Climategate discussion
Event and participant details

At Dec. 10 forum, MIT faculty experts discussed what 'Climategate' really means for climate science and the ongoing policy negotiations in the Congress and at Copenhagen.

The leaked e-mails from the climate scientists at the University of East Anglia (UEA) have caused a huge public relations headache. The e-mails, in which the scientists seemed to question or even exaggerate their research, have provided new fodder for climate-change skeptics, who now feel they have "proof" that global warming is a hoax. The e-mails may be proof of nothing other than frustrated or impatient scientists, but the scientific community nonetheless has to deal with the fallout — the public’s increasing doubt about the validity of climate science, and maybe even doubt about scientific research in general.

"The Great Climategate Debate," held on Dec. 10 (see event and participant details) featured a panel of MIT faculty addressing the issues — both scientific and political — surrounding the controversy in the hopes that it wouldn't become the main focus of this month's United Nations Climate Change Conference in Copenhagen, Denmark. The debate involved the many issues raised by the scandal: the validity of climate science, the motives of the hackers who leaked the e-mails, the need for responsible scientific reporting and ethical standards, and the way the public interprets and understands information from the scientific community.

Kerry Emanuel, the Breene M. Kerr Professor of Meteorology, fears that Climategate was a "premeditated distraction from the main issues" of climate science. Emanuel said he's concerned with the identity and motives of the hackers, citing the very rich "machine" of global warming deniers as a possible culprit. As for the e-mails, he thinks they show nothing more than "humans — a few with failings. Mostly, it shows scientists hard at work."

Richard Lindzen, the A. P. Sloan Professor of Meteorology, believes the e-mails were leaked by a whistleblower within the Climate Research Unit who, like Lindzen, is disillusioned by what he sees as the scientific community's one-sided argument over climate change. "Undeniably," he said, "we are dealing with issues illegal and unethical." The debate, he said, is whose illegal and unethical behavior is the larger issue: that of the scientists or that of the hackers?

Although Ron Prinn, the TEPCO Professor of Atmospheric Science and director of the Center for Global Change Science, thinks the e-mails are unethical and unprofessional, he believes this means little for the validity of the UEA scientists' studies. This controversy, he said, should not cause us to reevaluate our concern over global warming. Prinn notes that the larger body of evidence remains robust: The UEA scientists represent just a few of the many scientists from institutions around the world who are studying climate-change issues.

More...

Video

Moderator: Henry D. Jacoby
Kerry Emanuel '76, PhD '78
Judith Layzer PhD '99
Stephen Ansolabehere
Ronald G. Prinn SCD '71
Richard Lindzen

3 Questions
MIT News

The co-director of MIT's Global Change program discusses what to expect from the U.N. Climate Change Conference, and the effects of 'Climategate'

Delegates from around the world began meeting this week in Copenhagen to try to work out a new U.N. pact to address global climate change. Henry Jacoby, co-director of the MIT Joint Program on the Science and Policy of Global Change and professor of management at the Sloan School of Management, talks about what to watch for at the December 7-18 conference, and what the repercussions may be from the recent release of hacked e-mails and other documents from the University of East Anglia relating to climate-change research. Climate-change skeptics have dubbed the affair “Climategate” and say the materials show a scientific conspiracy to exaggerate the risks of climate change. Many in the scientific community, however, say the release of documents represents a smear campaign.

Q: Expectations about the Copenhagen climate meeting seem to have been on a roller-coaster ride. What is your sense at this point of what will come of this meeting?

A: The original objective and expectation, back when the negotiating text for this meeting was agreed in Bali, was that they would have some kind of binding commitments by developing countries, some agreement to actions by developed countries, and agreement on financial transfers. That’s what they were supposed to do. We aren’t going to be able to do that, for a couple of reasons. [More... ]

Q. How serious are the revelations in the so-called “Climategate” release of e-mails, and what effect do you think that may have on Copenhagen or on other attempts to deal with climate-change issues?


A. There are several ways of thinking about that. Is it a serious challenge to the science of this issue? The answer is no. This is kind of a peek under the blanket of a discussion that went on 10 years ago, about the analysis of tree rings and other data, to try to reconstruct temperature histories over the last thousand years. The work led to the conclusion that the temperature rise of the last 50 years is unique in its pace and has produced temperatures higher than any seen in the last thousand years.

There has been a lot of analysis of that issue since, by other groups, reaching similar conclusions. Also, the basis of our work, as we develop our impression of the risk, does not depend on that data. It depends on much more firm temperature information from the last 150 years. So in terms of its effect on the science, I don't believe it's serious.

It is unfortunate, however, that this has an effect on politics in the U.S. It makes it appear that there's some conspiracy of scientists here. Scientists talk to each other in informal ways. A lot of words they use appear different in public than what they were intended to be. And to some degree this email file is being purposefully misinterpreted, creating an impression that's really unfortunate. But it is true that these scientists should have been more careful — they didn't understand, I think, when they were doing this original work, how important this would be in the political discussion. It provides ammunition to people who argue climate is not a problem, and confuses the public. How serious that is, I don't know. [More... ]

Q. How urgent is the need for action on climate change, in your view? That is, if the world fails to adopt specific, binding targets for reduction of greenhouse gases at this meeting, how serious could the consequences of that be?

A. This is a century-scale problem, so it's not exactly a matter of what you do this year. But we've been at this for 20 years, and we haven't done very much yet. What's important is to get started. We have a lot to learn about the costs of mitigation, and we have to learn even more about the climate system, but waiting to find out before taking action can be costly.

We need to do something to reduce the impact of human activities over a timescale of many decades, but the decades are going by. It's not crucial what we do in 2009 or 2010, but it's quite important that we get started on some serious measures to decrease emissions, and create the international structure, and domestic policies, to have some chance for sustained action over many decades. It’s just a matter of lifting one foot to take the first step now. Long-term targets, say for specific reductions by 2050, have their purpose in terms of motivating people. But the main thing is we’ve got to agree to do something in the short run, on critical issues like what the United States is going to do, and what the relationship is going to be between the developing and developed countries. So achievements this year or next year are not crucial, but failing to get the process on track would be very serious.

More...

3 Questions
MIT News

MIT’s Joint Program on the Science and Policy of Global Change has pegged the annual cost of the proposed cap-and-trade legislation in Congress at $400 per U.S. household. But estimating the cost of doing nothing is far more difficult.

Sergey Paltsev, a principal research scientist in MIT’s Joint Program on the Science and Policy of Global Change, was the lead author of a recent report that analyzed the costs of climate legislation currently being debated in Congress. The analysis looked at the costs associated with the Waxman-Markey bill that was passed in June, and found the bill’s cap-and-trade provisions would have an average annual cost per U.S. household of $400. The study did not provide a comparison of what costs would be for a “no policy” case — in other words, the costs that would result from unmitigated climate change, or from other causes such as air or water pollution that might be associated with unregulated burning of fossil fuels.

Q: Have there been any changes proposed since the original bill was passed, or that are currently under discussion, that would make much of a difference in this cost estimate, one way or the other?

A: Currently, the already-passed Waxman-Markey bill and the Senate version, the Kerry-Boxer bill, are similar in emissions-reduction targets and total offsets. There are some minor differences, but unless major changes are proposed during the discussions in the Senate, the overall costs are similar. It should be noted that the heat of the discussion is now on emission allowance allocation, which would determine who gets the emissions rights for free, who has to pay for them, and how the permit revenue will be spent. The outcome of this process would benefit or hurt certain industries or households of different income classes. The decisions about revenue allocation would affect who gains and who loses more, and as the stakes are high, there are many parties trying to influence the outcome. But the average economic burden, which is what we calculated, is not much affected by the allowance allocation.

Q: Apart from measures that are specifically being considered now, did your analysis suggest any different approaches, or modifications of the present proposal, that would bring about any significant reduction in these costs?

A: We have done other studies where we have considered issues related to the design of cap-and-trade or carbon tax systems. Ultimately, the cost of the policy is determined by the reduction targets, the possibility of banking or borrowing of permits over time, the amounts of offsets, and any additional measures directed at greenhouse gas reduction, such as renewable electricity standards, subsidies to carbon-free technologies, building standards, energy efficiency measures, etc. For the same reduction targets, overall costs are lower if there are fewer additional measures. However, these additional measures are popular because they make it easier to hide the true cost of the policy. For example, renewable electricity standards would reduce the carbon price but increase the overall cost to the economy. As the carbon price is a more visible indicator and overall cost is harder to measure, legislators might prefer to introduce such standards despite their economic inefficiency, simply because they create an illusion of achieving a target at a lower cost. At the same time, as I have already mentioned, distribution of allowance revenue could reduce the impact on, for example, low-income families or coal-producing regions, and we have a forthcoming study addressing this issue.

Q: Can you address how the costs that could result from a “no policy” case might compare with the costs of the proposed regulations?

A: In the case of “no climate policy,” I think it is more appropriate to talk about "damages" instead of “costs,” because there are some things that can be easily associated with dollar amounts and there are other things that are harder to quantify and to put a price tag on. At the MIT Joint Program we have done studies where we are trying to quantify the costs associated with the impacts of climate change on agriculture and coastal infrastructure, and of air pollution on human health. These are easier to quantify. However, there are many other important effects that cannot be convincingly put into a dollar measure, and for this reason we have not tried to estimate the economic and environmental effects of a no-policy path. Consider, for example, the main icon of climate change — polar bears. How can one put an appropriate cost in dollar terms on the potential disappearance of polar bears due to melting Arctic ice? Or, as another example, on coral bleaching due to increasing ocean temperature and acidification? Some people even argue that climate change is a strategic problem that should not be considered in terms of a traditional “benefit-cost” approach.

In our analysis of the Waxman-Markey bill we focus on estimating costs of the stated targets. We always stress that there are many uncertainties in our cost estimates and we try to quantify these uncertainties, but the uncertainties in the damages estimates are much larger.

Some people argue about yet another aspect of the problem. Societies have many important issues where resources are needed — to name just a few, the fight against hunger and poverty, improved access to medical facilities and education, fighting AIDS and malaria, and providing a better water supply. Climate change is an important problem, but is it diverting resources from other no-less-important problems? There are plenty of links between climate change, poverty, water supply, and diseases — but with scarce resources, is it better to focus on solving climate change or, for example, directly on fighting poverty? Obviously, we should try to do both. But where should the emphasis be? These are tough questions: How do we equate a potential loss of life of a polar bear with that of a hungry child in Africa now?

More...