News + Media
By Jennifer Chu, MIT News Office
If you’re planning to build that dream beach house along the East Coast of the United States, or would like to relocate to the Caribbean, a new study by economists and climate scientists suggests you may want to reconsider.
Researchers from MIT and Yale University have found that coastal regions of North America and the Caribbean, as well as East Asia, are most at risk for hurricane damage — a finding that may not surprise residents of such hurricane-prone communities. However, the researchers say by the year 2100, two factors could more than quadruple the economic damages caused by tropical storms in such regions and around the world: growing income and global warming.
In a paper published this week in Nature Climate Change, researchers developed a model to predict hurricanes around the world, looking at how hurricane activity might change in the next 100 years both with and without climate change.
Even in a world without climate change, where rates of greenhouse gas emissions remain stable, the researchers found that annual economic damages from hurricanes could double in the next century: Global population is expected to reach 9 billion by 2100, likely leading to more development along hurricane-prone coastlines. Given such growth, the researchers projected that worldwide annual damage from hurricanes — currently $26 billion — could increase to $56 billion in the next century.
Under a similar economic scenario, but with the added factor of climate change, the team found that annual hurricane damage could quadruple to $109 billion by 2100. According to the researchers’ model, proliferating greenhouse gases would likely increase the incidence of severe tropical cyclones and hurricanes, which would increase storm-related damage.
Furthermore, the researchers found that the distribution of damage is not even across the world. Their model indicates that climate change would cause the most hurricane-related damage in North America, followed by East Asia, Central America and the Caribbean. The rest of the world — particularly the Middle East, Europe and South America — would remain relatively unscathed, experiencing little to no hurricane activity.
Treading new territory
Kerry Emanuel, the Cecil and Ida Green Professor of Atmospheric Science at MIT, says results from the model developed by the team may have wide-ranging implications for regional planning and emergency preparedness.
“It could be used by lots of different people … to understand what resources to put into certain countries to mitigate or adapt to tropical cyclone changes resulting from climate change,” says Emanuel, a co-author of the paper. “For example, urban planners in cities might want to know how high to make the flood barriers if sea levels go up.”
Emanuel worked with researchers at Yale to develop the hurricane prediction model, an effort that combined two disparate disciplines: atmospheric modeling and economics. Emanuel describes the work as “treading new territory,” noting that the researchers had to “do a lot of back and forth to understand each other’s terminology.”
After sorting out semantics, the group set out to predict tropical cyclone and hurricane activity around the world. The researchers relied on four existing climate models commonly used by the Intergovernmental Panel on Climate Change to assess climate risks. Each of the models tracks and forecasts climate variables such as wind, temperature, large-scale ocean currents and ocean temperatures. However, the models resolve these variables only at a relatively coarse resolution of 100 to 200 kilometers. Since simulating a tropical cyclone that may whip into a massive hurricane requires a resolution of a few kilometers, using the climate models alone to simulate storms is highly problematic.
Seeds of a cyclone
Instead, Emanuel and his colleagues embedded a tropical-cyclone model within each climate model. The combination allowed the team to see where storms might develop around the world, based on regional weather systems. The researchers randomly scattered hundreds of thousands of “seeds,” or potential tropical cyclones, throughout each of the four models, then ran the models to see where the seeds developed into significant storms. There was some variation among the models, but in general they revealed that 95 percent of seeds simply dissipated, leaving 5 percent likely to turn into hurricanes under favorable conditions such as warm ocean water and low wind shear. The team used enough seeds to generate 17,000 surviving storms in each simulation.
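To make the method concrete, here is a minimal sketch of the seeding idea in Python. The 5 percent survival rate and the 17,000-storm target come from the description above; the uniform random draw standing in for cyclone physics is an illustrative assumption, not the team's embedded tropical-cyclone model.

```python
import random

# A minimal sketch of the seeding approach (assumed mechanics, not the
# authors' model): scatter candidate cyclones at random and keep the small
# fraction that "survive." The 5 percent survival rate and 17,000-storm
# target come from the article.

SURVIVAL_RATE = 0.05
TARGET_STORMS = 17_000

def seed_storms(target=TARGET_STORMS, survival_rate=SURVIVAL_RATE, seed=42):
    """Scatter seeds until `target` of them develop into significant storms."""
    rng = random.Random(seed)
    seeds_scattered = 0
    storms = 0
    while storms < target:
        seeds_scattered += 1
        # Placeholder for running one seed through the embedded cyclone model
        # under the climate model's local winds and ocean temperatures.
        if rng.random() < survival_rate:
            storms += 1
    return seeds_scattered, storms

seeds_scattered, storms = seed_storms()
print(f"Scattered {seeds_scattered:,} seeds for {storms:,} surviving storms")
# Roughly 340,000 seeds are needed on average at a 5 percent survival rate,
# consistent with "hundreds of thousands" of seeds per model.
```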
The team also looked at each country’s hurricane-related damage after adjusting for its gross domestic product (GDP). The researchers found that wealthier nations like the United States are able to absorb economic losses from a hurricane better than many others, such as island nations in the Caribbean.
“These are all small islands, and most of their GDPs are exposed,” Emanuel says. “In the United States, you take all this damage and divide it by the GDP of the whole country, and you get a smaller relative impact.”
Dan Osgood, a lead scientist in the financial instruments sector team for the Earth Institute at Columbia University, sees the new model as a useful tool, particularly for the insurance industry.
“Insurance companies [are] hungry for climate research such as this,” says Osgood, who was not involved in the research. “Having solid science, they can often offer more reasonable and more accurate prices, providing better deals to consumers, as well as accurate price incentives to help people [avoid] taking unreasonable building risks.”
The researchers stress that there is a fair amount of uncertainty in the predictions across the four climate models. For example, in estimating the effect of climate change on tropical-cyclone damage, the models’ predictions ranged from $14 billion to $80 billion a year.
Emanuel also points out that “looking at natural disasters strictly through an economic lens doesn’t tell you the whole story.” For example, despite a growing economy and population, if severe tropical cyclones become more frequent, people may choose to build elsewhere — a phenomenon Emanuel says an improved model will have to take into account.
Other authors on the paper are Robert Mendelsohn, Shun Chonabayashi and Laura Bakkensen from the Yale School of Forestry and Environmental Studies.
By: Vicki Ekstrom, Joint Program on the Science and Policy of Global Change

Regional climate policies depend largely on fiscal strategies and can have effects that spiral across the globe, a new MIT report shows in the January edition of the Journal of Transport Economics and Policy. The report — titled “Biofuels, Climate Policy, and the European Vehicle Fleet” — uses the European transportation system as a test case to show the significant impact various fiscal policies can have on emissions reductions.
“The effectiveness of climate policies in isolation might depend crucially on the fiscal rules and environment,” says Sebastian Rausch, a co-author of the study and a research scientist at MIT’s Joint Program on the Science and Policy of Global Change. “So if you want to think about effective emissions-reduction policies and climate policies you have to take into consideration their interaction with other mechanisms like taxes and tariffs.”
For decades, Europeans have relied on diesel to power their cars. Though diesel is also better for the environment, drivers have traditionally chosen it because higher taxes on gasoline make diesel the cheaper alternative. But now, Europe is encouraging its drivers to consider greener options. The European Union has imposed a renewable fuel mandate that requires 10 percent of fuel to come from renewable sources like biodiesel or ethanol by 2020.
But will the higher price tag that often comes with renewables cause the mandate to have a negative effect? The MIT researchers say no. Studying the system with and without the mandate, they find that the number of drivers using diesel and biodiesel continues to increase with time because of rising oil prices and a tax system that balances out the additional expense of using renewables.
“So fueling up with biodiesel would still be 69 cents a gallon cheaper than gas,” Rausch says, “and it has the added benefit of reducing European emissions by about 8 percent by 2030.”
The report further analyzes the impact of tax or tariff changes, in combination with the imposed mandate. As one might expect, when gas and diesel have an equal tax rate almost a quarter fewer drivers choose diesel by 2030. The renewable fuel mandate also does not have a large impact on emissions because more drivers turn to gas. But if biodiesel and ethanol tariffs are removed, Europe can achieve significant emission reductions — about 45 percent — as these renewable fuels become cheaper to import and use. At the same time, diesel vehicles would all but disappear as ethanol blends crowd out the diesel market.
On a global scale, the report shows that while renewable initiatives can cut emissions within the country that adopts them, they can also cause spikes in emissions in other countries — what is known as “leakage.”
Rausch explains: “You’re still driving a fair amount of diesel vehicles, but the fuel to drive those vehicles now comes from Brazil and other countries because you’ve removed your tariffs. You don’t have to produce as much diesel in the EU, so your emissions there are a little bit lower. But the countries now producing more fuel to import to the EU see higher emissions.”
But there is still a positive side, Rausch says: “Because there’s a switch in imports from diesel to biofuels, emissions do get reduced in other countries as well because biofuel production releases fewer emissions than diesel production.”
These fuel changes in Europe can have a “snowballing effect,” Rausch says. Along with “leakage,” there can be other consequences. If Europe evens out its tax system, for example, increased demand for gasoline in Europe would drive up gasoline prices outside of Europe and lower gas consumption and emissions in general.
Automakers have made great strides in fuel efficiency in recent decades — but the mileage numbers of individual vehicles have barely increased. An MIT economist explains the conundrum.
By: Peter Dizikes, MIT News Office
Contrary to common perception, the major automakers have produced large increases in fuel efficiency through better technology in recent decades. There’s just one catch: All those advances have barely increased the mileage per gallon that autos actually achieve on the road.
Sound perplexing? This situation is the result of a trend newly quantified by MIT economist Christopher Knittel: Because automobiles are bigger and more powerful than they were three decades ago, major innovations in fuel efficiency have only produced minor gains in gas mileage.
Specifically, between 1980 and 2006, the average gas mileage of vehicles sold in the United States increased by slightly more than 15 percent — a relatively modest improvement. But during that time, Knittel has found, the average curb weight of those vehicles increased 26 percent, while their horsepower rose 107 percent. Had vehicle weight and power stayed at their 1980 levels, fuel economy would actually have increased by 60 percent between 1980 and 2006, as Knittel shows in a new research paper, “Automobiles on Steroids,” just published in the American Economic Review.
Thus if Americans today were driving cars of the same size and power that were typical in 1980, the country’s fleet of autos would have jumped from an average of about 23 miles per gallon (mpg) to roughly 37 mpg, well above the current average of around 27 mpg. Instead, Knittel says, “Most of that technological progress has gone into [compensating for] weight and horsepower.”
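The arithmetic behind those figures can be checked directly. In the sketch below, the 23-mpg baseline and the 15 percent and 60 percent gains are the values quoted above; the calculation simply applies each rate to the baseline.

```python
# A back-of-the-envelope check of the figures above, assuming a 1980 fleet
# average of about 23 mpg (the value quoted in the article).

mpg_1980 = 23.0

# Observed: mileage rose slightly more than 15 percent between 1980 and 2006.
observed_2006 = mpg_1980 * 1.15        # ~26.5 mpg, i.e. the "around 27 mpg" cited

# Counterfactual: with 1980 size and power, the same technology would have
# delivered roughly a 60 percent gain.
counterfactual_2006 = mpg_1980 * 1.60  # ~36.8 mpg, i.e. "roughly 37 mpg"

print(f"Observed 2006 fleet average:          {observed_2006:.1f} mpg")
print(f"Same-size, same-power counterfactual: {counterfactual_2006:.1f} mpg")
```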
And considering that the transportation sector produces more than 30 percent of U.S. greenhouse gas emissions, turning that innovation into increased overall mileage would produce notable environmental benefits. For his part, Knittel thinks it is understandable that consumers would opt for large, powerful vehicles, and that the most logical way to reduce emissions is through an increased gas tax that leads consumers to value fuel efficiency more highly.
“When it comes to climate change, leaving the market alone isn’t going to lead to the efficient outcome,” Knittel says. “The right starting point is a gas tax.”
Giving the people what they want
While auto-industry critics have long called for new types of vehicles, such as gas-electric hybrids, Knittel’s research underscores the many ways that conventional internal-combustion engines have improved.
Among other innovations, as Knittel notes, efficient fuel-injection systems have replaced carburetors; most vehicles now have multiple camshafts (which control the valves in an engine), rather than just one, allowing for a smoother flow of fuel, air and exhaust in and out of engines; and variable-speed transmissions have let engines better regulate their revolutions per minute, saving fuel.
To be sure, the recent introduction of hybrids is also helping fleet-wide fuel efficiency. Of the thousands of autos Knittel scrutinized, the most fuel-efficient was the 2000 Honda Insight, the first hybrid model to enter mass production, at more than 70 mpg. (The least fuel-efficient car sold in the United States that Knittel found was the 1990 Lamborghini Countach, a high-end sports car that averaged fewer than nine mpg.)
To conduct his study, Knittel drew upon data from the National Highway Traffic Safety Administration, auto manufacturers and trade journals. As those numbers showed, a major reason fleet-wide mileage has increased only slowly is that so many Americans have chosen to buy bigger, less fuel-efficient vehicles. In 1980, light trucks represented about 20 percent of passenger vehicles sold in the United States. By 2004, light trucks — including SUVs — accounted for 51 percent of passenger-vehicle sales.
“I find little fault with the auto manufacturers, because there has been no incentive to put technologies into overall fuel economy,” Knittel says. “Firms are going to give consumers what they want, and if gas prices are low, consumers are going to want big, fast cars.” And between 1980 and 2004, gas prices dropped by 30 percent when adjusted for inflation.
The road ahead
Knittel’s research has impressed other scholars in the field of environmental economics. “I think this is a very convincing and important paper,” says Severin Borenstein, a professor at the Haas School of Business at the University of California at Berkeley. “The fact that cars have muscled up rather than become more efficient in the last three decades is known, but Chris has done the most credible job of measuring that tradeoff.” Adds Borenstein: “This paper should get a lot of attention when policymakers are thinking about what is achievable in improved automobile fuel economy.”
Indeed, Knittel asserts, given consumer preferences in autos, larger changes in fleet-wide gas mileage will occur only when policies change, too. “It’s the policymakers’ responsibility to create a structure that leads to these technologies being put toward fuel economy,” he says.
Among environmental policy analysts, the notion of a surcharge on fuel is widely supported. “I think 98 percent of economists would say that we need higher gas taxes,” Knittel says.
Instead, the major policy advance in this area under the current administration has been a mandated rise in CAFE standards, the Corporate Average Fuel Economy of cars and trucks. In July, President Barack Obama announced new standards calling for a fleet-wide average of 54.5 mpg by 2025, building on an earlier requirement of 35.5 mpg by 2016.
According to Knittel’s calculations, the automakers could meet the new CAFE standards by simply maintaining the rate of technological innovation experienced since 1980 while reducing the weight and horsepower of the average vehicle sold by 25 percent. Alternatively, Knittel notes, a shift back to the average weight and power seen in 1980, along with a continuation of the trend toward greater fuel efficiency, would lead to a fleet-wide average of 52 mpg by 2020.
That said, Knittel is skeptical that CAFE standards by themselves will have the impact a new gas tax would. Such mileage regulations, he says, “end up reducing the cost of driving. If you force people to buy more fuel-efficient cars through CAFE standards, you actually get what’s called ‘rebound,’ and they drive more than they would have.” A gas tax, he believes, would create demand for more fuel-efficient cars without as much rebound, the phenomenon through which greater efficiency leads to potentially greater consumption.
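A rough, purely illustrative calculation shows how rebound eats into fuel savings. Every number below, from the gas price to the elasticity of driving with respect to per-mile cost, is a hypothetical assumption rather than a value from Knittel's paper.

```python
# A toy rebound calculation. All values are hypothetical assumptions for
# illustration only; none come from Knittel's paper.

GAS_PRICE = 3.50            # $/gallon (assumed)
MPG_BEFORE, MPG_AFTER = 27.0, 35.5
REBOUND_ELASTICITY = -0.1   # assumed response of miles driven to per-mile cost

cost_before = GAS_PRICE / MPG_BEFORE          # ~$0.130 per mile
cost_after = GAS_PRICE / MPG_AFTER            # ~$0.099 per mile
cost_change = cost_after / cost_before - 1    # about -24%

miles_change = REBOUND_ELASTICITY * cost_change                # about +2.4%
fuel_change = (1 + miles_change) * MPG_BEFORE / MPG_AFTER - 1  # about -22%

print(f"Per-mile fuel cost: {cost_change:+.1%}")
print(f"Miles driven:       {miles_change:+.1%}")
print(f"Fuel consumed:      {fuel_change:+.1%}")
# Fuel use falls by less than the raw mpg gain implies, because cheaper
# driving induces some extra miles: that gap is the rebound.
```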
Fuel efficiency, Knittel says, has come a long way in recent decades. But when it comes to getting those advances to have an impact out on the road, there is still a long way to go.
By: Vicki Ekstrom, Joint Program on the Science and Policy of Global Change

Shale gas — a resource that has grown significantly in just the last few years to one-quarter of the domestic gas supply — is cheaper and produces fewer emissions than traditional coal or oil. But recent environmental concerns, combined with shale gas's important role in the global economy, have prompted the Obama administration and MIT researchers to investigate the resource and its potential impacts.
“People speak of [natural] gas as a bridge to the future, but there had better be something at the other end of the bridge,” Henry Jacoby, co-director emeritus of MIT’s Joint Program on the Science and Policy of Global Change, said earlier this year after co-authoring a report by the MIT Energy Initiative (MITEI) on The Future of Natural Gas.
Jacoby’s nagging thoughts prompted him and other researchers to further study shale gas and how its success could shape U.S. energy policy, including future technological development. Building on the MITEI study, the researchers' new report — The Influence of Shale Gas on U.S. Energy and Environmental Policy — appears in this month's inaugural edition of the journal Economics of Energy and Environmental Policy.
“Prior to this we hadn’t compared U.S. gas production with and without shale,” Jacoby says of the new research. “This report makes that comparison. And we found much of what we already knew — which is a good thing — that shale makes a big difference. It helps lower gas prices, it stimulates the economy and it provides greater flexibility to ease the cutting of emissions. But it also suppresses renewables.”
The researchers came to these conclusions by considering what the nation would look like with and without shale under several policy scenarios. Under one scenario, they found that without shale gas, gas prices would rise to about five times current levels by 2050, and electricity prices would also grow; with shale gas, gas prices should only about double. The shale supply also reduces electricity price growth by 5 percent in 2030 and 10 percent in 2045, compared with a scenario without shale gas.
A report released last month by IHS Global Insight, a global research firm commissioned by America’s Natural Gas Alliance, shows similar results. Prices would drop 10 percent in 2036 with shale, according to IHS, and the industry would add 870,000 U.S. jobs by 2015.
John Deutch, MIT professor and chair of a special U.S. Department of Energy panel studying shale, agrees that the shale industry can make a significant economic contribution. Deutch, who was associated with the earlier MITEI report but not the new MIT study, said that the most recent estimates put employment in the shale gas industry at three-quarters of a million jobs.
“More jobs are being created in Pennsylvania and Ohio by shale gas production than anything else that I’m aware of,” Deutch said at a recent MIT lecture, suggesting the significance of those two battleground states in U.S. elections.
“Over the last couple of years I’ve realized that what’s happening with unconventional natural gas [shale] is the biggest energy story that’s happened in the 40-plus years that I’ve been watching energy development in this country,” says Deutch, who served as undersecretary of the Department of Energy in the 1970s.
Shale’s low price tag is one of the reasons for its boom. For every $4 we pay for a given amount of energy from natural gas, we pay about $25 for the same energy from oil, according to recent statistics from the U.S. Energy Information Administration.
Jacoby and Deutch agree this is not sustainable, and that there is a great incentive to continue to tap into the shale market — with Deutch calling shale “remarkably inexpensive” compared to other forms of natural gas.
This successful outlook has prompted some of the world’s leading oil companies to further invest in natural gas, and specifically shale gas production. Last month, Shell announced it would double gas production in North America in the next three years and that it has recently expanded its work to China.
But Jacoby warns, “Natural gas is a finite resource. We will eventually run into depletion and higher cost.” He adds, “It still releases greenhouse gas emissions. So if we’re going to get to a point where we strictly limit those emissions, we need renewables.”
The continued need for strong renewables prompts concerns, as the study finds that shale use suppresses the development of renewables. Under one scenario, for example, the researchers impose a renewable-fuel mandate. They find that, with shale, renewable use never goes beyond the 25 percent minimum standard they set — but when shale is removed from the market, renewables gain more ground.
These findings are significant in light of several concerns surrounding the unpredictable shale gas market and future environmental regulations.
The most headline-grabbing concern about shale gas extraction is that fluids used in the production process — known as hydraulic fracturing, or simply fracking — could seep into and contaminate groundwater supplies. While the report found these concerns to be “overstated,” the Deutch shale panel said in November that “environmental issues need to be addressed now.”
This conclusion, along with uncertainties about how stringent greenhouse gas emission targets will be going forward, leaves the regulatory environment in question.
There’s also the concern that the global gas market is unpredictable because the shale revolution is still in its early stages.
Jacoby says the development of the industry in the United States is important because prices here are much lower than in other gas markets — namely, Europe and Asia. While we pay less than $4 per thousand cubic feet, other markets pay up to $16. Because gas is so much cheaper here, there is potential for the United States to become an exporter.
But Jacoby says this is really a “matter of timing.”
“In the near term, our supplies are cheap enough that we should have the ability to export,” Jacoby says. “But over time, we likely won’t be able to compete with places like Russia and the Middle East that have lower costs, and eventually we’ll again turn to importing gas.”
Jacoby compares the global gas market to the oil industry. As shale resources are developed in places such as China, which recently announced that it was tapping at least 20 new reserves, prices will likely drop overseas and the United States will turn to cheaper imports as it has for oil.
An uncertain international gas market, an unpredictable regulatory environment with more stringent emission goals and decreasing natural gas reserves over time all point to the growing need to continue developing renewable technologies.
“Effective use of renewables, namely wind and solar, is still many years away,” Jacoby says. “How we tap into those resources and effectively work them into our electric grid still needs to be figured out. To get us there we need a robust R&D program, so we’ll have renewable energies up and working effectively in future decades, when emissions regulations are stricter and gas reserves are depleting.”
Shale might provide the flexibility to meet reduction targets at lower costs today, making it a strong “bridge” in the short term to a low-carbon future. But the report concludes that we can’t let “the greater ease of the near term … erode efforts to prepare a landing at the other end of the bridge.”

As the world’s population continues to expand, our natural resources will become increasingly strained. In an effort to find sustainable solutions for the planet’s growing population while minimizing environmental impacts, MIT’s Environmental Research Council (ERC) has put forward a detailed implementation plan to establish a Global Environmental Initiative to complement the MIT Energy Initiative (MITEI).
The interdisciplinary, faculty-led council presented the plan to the MIT community last Thursday in a forum held at the Kirsch Auditorium in the Stata Center. Council members outlined an initiative that would bring together MIT’s “core strengths” across campus to help solve the world’s pressing environmental challenges, from mitigating climate change to curbing contamination and maintaining fresh water supplies.
“It’s impossible to imagine a problem bigger and more compelling, or more suited to the strengths of MIT, than how to drive toward global sustainability,” said MIT President Susan Hockfield in a video address to the forum. “Far too often the public conversation about the environment and climate gets mired in the discourse of blame and despair. Today, I believe MIT has an opportunity, and frankly an obligation, to help replace that stalemate with the momentum of creative, realistic, positive change.”
Once launched, the Global Environmental Initiative is expected to focus on cultivating six key areas of academic research throughout MIT: climate, oceans, water, ecological resilience, contamination mitigation and sustainable societies.
Dara Entekhabi, professor of civil and environmental engineering and chair of the ERC, says that while many researchers at MIT are working in the research themes identified in the plan, often these efforts occur in isolation. For example, a biologist studying the health effects of contaminants could give valuable input to chemists designing new materials. Or a mechanical engineer designing a water purification facility may benefit from an urban planner’s perspective. The environmental initiative will aim to identify and bring together such related efforts, foster technological and social innovations in all six environmental research themes, and identify strategic directions for growth.
In the areas of climate and oceans, MIT already has a strong foundation of interdisciplinary collaboration. The Center for Global Change Science, the Joint Program on the Science and Policy of Global Change, the Climate Modeling Initiative, and the recently launched Lorenz Center all focus on understanding the climate system and human contributions to that system. Similarly, in the area of ocean science, MIT has a long history of research and educational collaboration with the Woods Hole Oceanographic Institution (WHOI).
Going forward, the Global Environmental Initiative would work to strengthen these existing efforts and identify new research priorities. For example, in the climate arena, the plan proposes increasing work devoted to reducing uncertainty in climate predictions. In the case of oceanic studies, the initiative would boost efforts to harness the potential of new data collection and analysis technologies to monitor the impacts of human activity.
In addition to strengthening existing environmental programs, Entekhabi says the initiative will plant the seeds for new cross-campus collaborations in the areas of water, ecological resilience, contamination mitigation and sustainable societies. Thursday’s forum highlighted work already underway in labs throughout MIT in these key areas.
For example, researchers across multiple departments are tackling various challenges related to water, from engineering portable desalinators and water-purifying membranes to analyzing greenhouse gas emissions from water treatment plants and designing city sidewalks that direct rainwater to green spaces. In the area of ecological resilience, biologists and geneticists are studying the central role microbes play in regulating the global environment. In an effort to mitigate future environmental contamination, chemists and material scientists are investigating ways to create environmentally “benign-by-design” products. And economists, social scientists and urban planners are envisioning ways to make societies more sustainable by examining global food supply chains, designing “green” buildings, and evaluating sustainable transportation and urban designs.
One of the initiative’s first goals, once launched, will be securing funding for graduate and postdoctoral fellowships, as well as ignition grants, to foster innovative, cross-disciplinary research projects that would otherwise struggle to attract initial funding from traditional sources. The Earth Systems Initiative, which Entekhabi currently heads, has had an ignition grant program in place to support new, high-risk projects in earth sciences. This program will likely form the foundation for similar efforts under the new initiative.
The initiative also lays out a plan for creating educational programs. “Simply put,” the plan reads, “incorporating an understanding of the linkages between environmental quality and human welfare must become an essential part of MIT’s basic educational message.” In this spirit, the initiative will host workshops and symposia, and support the development of a new undergraduate minor in environment and sustainability.
In describing the Global Environmental Initiative’s broad goals during last Thursday’s forum, Entekhabi drew a comparison with the history of medicine. He noted that in the last few decades, the use of trial-and-error methods in medicine, such as exploratory surgery and empirical drug discovery, has largely been replaced by advanced medical imaging and targeted drug synthesis.
“What we need to do for the environment is what we’ve done for our health, and our advanced medical practice,” Entekhabi said. “We need to replace trial-and-error with rational design. And that requires understanding fundamentally how the system works, in the same way as understanding fundamentally how human health works.”
The implementation plan for the Global Environmental Initiative is available for public review and comment until Feb. 10, 2012.
This talk will describe the tremendous potential benefits of shale gas and the environmental challenges posed by shale gas production. John Deutch will review the work of the Secretary of Energy Advisory Board Shale Gas Subcommittee, which he chaired, including the recommendations, the reasons for these recommendations, and the lessons to be learned from the experiences of this unusual advisory committee.
MIT report shows that with new policies, U.S. electric grid could handle expected influx of electric cars and wind and solar generation.

Over the next two decades, the U.S. electric grid will face unprecedented technological challenges stemming from the growth of distributed and intermittent new energy sources such as solar and wind power, as well as an expected influx of electric and hybrid vehicles that require frequent recharging. But a new MIT study concludes that — as long as some specific policy changes are made — the grid is most likely up to the challenge.
Study co-director Richard Schmalensee, the Howard W. Johnson Professor of Economics and Management at the MIT Sloan School of Management, says the two-year study came about “because a number of us were hearing two sorts of rhetoric” about the U.S. power grid: that it’s on the brink of widespread failure, or that simply installing some new technology could open up wonderful new opportunities.
“The most important broad finding was that both of these are false,” Schmalensee says. While the grid is not in any imminent danger, he says, “the current regulatory framework, largely established in the 1930s, is mismatched to today’s grid.” Moreover, he adds, today’s regulations are “highly unlikely [to] give us the grid of the future — a grid that by 2030 will support a range of new technologies and consumer services that will be essential for a strong and competitive U.S. economy.”
The report was commissioned by the MIT Energy Initiative (MITEI) and carried out by a panel of 13 faculty members from MIT and one from Harvard University, along with 10 graduate students and an advisory panel of 19 leaders from academia, industry and government.
While the grid’s performance is adequate today, decisions made now will shape that grid over the next 20 years. The MIT report recommends a series of changes in the regulatory environment to facilitate and exploit technological innovation. Among the report’s specific recommended changes: To enable the grid of the future — one capable of handling intermittent renewables — the United States will need effective and enhanced federal authority over decisions on the routing of new interstate transmission lines. This is especially needed, the report says, in cases where power is produced by solar or wind farms located far from where that power is to be used, requiring long-distance transmission lines to be built across multiple regulatory jurisdictions.
“It is a real issue, a chicken-and-egg problem,” says John Kassakian, a professor of electrical engineering at MIT and the study’s other co-director. “Nobody’s going to build these new renewable energy plants unless they know there will be transmission lines to get the power to load centers. And nobody’s going to build transmission lines unless the difficulty of siting lines across multiple jurisdictions is eased.”
Currently, when new transmission lines cross state boundaries, each state involved — and federal agencies as well, if federal lands are crossed — can make its own decisions about permission for the siting of these lines, with no centralized authority.
“There are many people who can say no, and nobody who can say yes,” Schmalensee explains. “That’s strategically untenable, especially since some of these authorities would have little incentive ever to say yes.”
The MITEI report recommends that the Federal Energy Regulatory Commission (FERC) either be given the authority to make decisions in such cases, or be designated as the “backstop” authority in cases where there are disputes.
The grid would also benefit from a restructuring of the way customers pay for its costs, the study found. Payment for electric distribution, like payment for generation, is currently calculated based on usage. But most of the costs involved are fixed; they don’t depend on usage. This gives utilities incentives to resist distributed generation, such as homeowners installing rooftop solar panels, and gives consumers excessive incentives to install such systems — and thereby to shift their share of fixed network costs to their neighbors. Fixed network costs, the report says, should be recovered primarily through customer charges that don’t depend on electricity consumption.
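A toy example with invented numbers illustrates the cost shifting the report describes: when fixed network costs are recovered through a per-kilowatt-hour charge, a customer who installs rooftop solar cuts their own bill without cutting the network's fixed costs, which are then re-spread over the remaining usage.

```python
# Hypothetical two-customer network; all numbers invented for illustration.

FIXED_NETWORK_COST = 1_000.0   # annual fixed cost of the wires, dollars
usage = {"solar household": 2_000.0, "neighbor": 6_000.0}  # kWh per year

def volumetric_shares(usage, fixed_cost):
    """Allocate the fixed cost in proportion to kilowatt-hours consumed."""
    total = sum(usage.values())
    return {name: fixed_cost * kwh / total for name, kwh in usage.items()}

before = volumetric_shares(usage, FIXED_NETWORK_COST)

# Rooftop panels halve the solar household's metered use; the network's
# fixed costs are unchanged.
usage["solar household"] = 1_000.0
after = volumetric_shares(usage, FIXED_NETWORK_COST)

for name in usage:
    print(f"{name}: ${before[name]:.0f} -> ${after[name]:.0f}")
# neighbor: $750 -> $857. The same fixed cost has shifted onto the neighbor.
```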
In addition, while many utilities have begun to install “smart meters” for their customers, most of these are not yet being used to provide feedback to customers that could shift electricity usage to off-peak hours.
“We haven’t done as much as we could to develop this capability, to learn how to do this,” Schmalensee says. “It could save everybody money, by cutting down the need to build new generators.” While overall growth in demand is expected to be modest and easily accommodated, without new policies peak demand will rise much faster, requiring new generating capacity. “We continue to build capacity that’s only used a few hours a year,” he says. Providing consumers with better price signals and the ability to play a more active role in managing their demand could significantly improve this imbalance, the report says.
Another area that will require restructuring, the study concluded, is cybersecurity: The more thoroughly the grid is interconnected, and the more smart meters are added to gather data about usage patterns, the greater the risk of security breaches or cyberattacks on the system.
At the moment, no agency has responsibility and authority for the entire grid. The report strongly recommends that some agency — perhaps the U.S. Department of Homeland Security — be given such responsibility and authority, but thorny issues related to authority over local distribution systems would need to be resolved. In addition, the report notes, it will be important to develop rules and systems to maintain the privacy of data on customers’ electricity usage.
Requiring the sharing of data, especially data collected as a result of federal investments through the American Recovery and Reinvestment Act of 2009, should be a significant priority, the report says. The government “spent a lot of money on pilot programs and experiments, and installations of a lot of new equipment that can improve the efficiency and reliability of the grid and the management of demand,” Kassakian says. But there needs to be more cooperation and communication about the results of those programs “in order to get the benefits,” he says.
In fact, widespread sharing of data from real-time monitoring of the grid could help prevent some failures before they happen, Kassakian says: “If you’re aware of what’s happening at the same time everywhere, you can observe trends, and see what might be an incipient failure. That’s very useful to know, and allows better control of the system.”
The MITEI study found that growth in the number of electric vehicles (EVs) on the road is likely to be slow enough, and widely distributed enough, that it shouldn’t create significant strain on the grid — although there may be a few locations where a particularly high penetration of such vehicles could require extra generating capacity. Some other effects could be subtle: For example, in some hot regions of the Southwest, grid components such as transformers are designed to cool off overnight when demand is ordinarily low. But a sudden influx of EVs charging at night could necessitate bigger transformers or cooling systems, while charging them at the end of the work day could significantly increase peak demand and thus the need for new capacity.
Utilities now spend very little on research, the study found, because regulators provide little incentive for them to do so. The report recommends that utilities put more money into research and development — both to make effective use of new technologies for monitoring and controlling the grid, and to understand customer response to pricing policies and incentives.
Panel discussion on the impact of climate change on agriculture and the food we eat.
By Seth Borenstein, Associated Press
WASHINGTON (AP) — Heat-trapping greenhouse gases in the atmosphere are building up so high, so fast, that some scientists now think the world can no longer limit global warming to the level world leaders have agreed upon as safe.
New figures from the U.N. weather agency Monday showed that the three biggest greenhouse gases not only reached record levels last year but were increasing at an ever-faster rate, despite efforts by many countries to reduce emissions.
As world leaders meet next week in South Africa to tackle the issue of climate change, several scientists said their projections show it is unlikely the world can hold warming to the target set by leaders just two years ago in Copenhagen.
"The growth rate is increasing every decade," said Jim Butler, director of the U.S. National Oceanic and Atmospheric Administration's Global Monitoring Division. "That's kind of scary."
Scientists can't say exactly what levels of greenhouse gases are safe, but some fear a continued rise in global temperatures will lead to irreversible melting of some of the world's ice sheets and a several-foot rise in sea levels over the centuries — the so-called tipping point.
The findings from the U.N. World Meteorological Organization are consistent with other grim reports issued recently. Earlier this month, figures from the U.S. Department of Energy showed that global carbon dioxide emissions in 2010 jumped by the highest one-year amount ever.
The WMO found that total carbon dioxide levels in 2010 hit 389 parts per million, up from 280 parts per million in 1750, before the start of the Industrial Revolution. Levels increased by 1.5 ppm per year in the 1990s and 2.0 ppm per year in the first decade of this century, and are now rising at a rate of 2.3 ppm per year. The other two major greenhouse gases — methane and nitrous oxide — are also soaring.
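For perspective, a naive linear extrapolation of the rates quoted above looks like the sketch below. It is purely illustrative, since actual concentrations depend on future emissions and carbon sinks rather than a constant growth rate.

```python
# A naive linear extrapolation from the rates quoted above (illustrative only;
# real concentrations depend on future emissions and carbon sinks).

PREINDUSTRIAL_PPM = 280.0
CO2_2010_PPM = 389.0
CURRENT_RATE = 2.3   # ppm per year, per the WMO figures

print(f"Rise since 1750: {CO2_2010_PPM - PREINDUSTRIAL_PPM:.0f} ppm")

# At a constant 2.3 ppm per year, each decade adds roughly 23 ppm.
for year in (2020, 2030, 2040):
    projected = CO2_2010_PPM + CURRENT_RATE * (year - 2010)
    print(f"{year}: ~{projected:.0f} ppm")
```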
The U.N. agency cited fossil fuel-burning, loss of forests that absorb CO2 and use of fertilizer as the main culprits.
Since 1990 — a year that international climate negotiators have set as a benchmark for emissions — the total heat-trapping force from all the major greenhouse gases has increased by 29 percent, according to NOAA.
The accelerating rise is happening despite the 1997 Kyoto agreement to cut emissions. Europe, Russia and Japan have about reached their targets under the treaty. But China, the U.S. and India are all increasing emissions. The treaty didn't require emission cuts from China and India because they are developing nations. The U.S. pulled out of the treaty in 2001, the Senate having never ratified it.
While scientists can't agree on what level of warming of the climate is considered dangerous, environmental activists have seized upon 350 parts per million as a target for carbon dioxide levels. The world pushed past that mark more than 20 years ago.
Governments have focused more on projected temperature increases rather than carbon levels. Since the mid-1990s, European governments have set a goal of limiting warming to slightly more than 2 degrees Fahrenheit (1.2 degrees Celsius) above current levels by the end of this century. The goal was part of a nonbinding agreement reached in Copenhagen in 2009 that was signed by the U.S. and other countries.
Temperatures have already risen about 1.4 degrees Fahrenheit (0.8 degrees Celsius) since pre-industrial times.
Massachusetts Institute of Technology professors Ron Prinn, Henry Jacoby and John Sterman said MIT's calculations show the world is unlikely to meet that two-degree goal now.
"There's very, very little chance," Prinn said. "One has to be pessimistic about making that absolute threshold." He added: "Maybe we've waited too long to do anything serious if two degrees is the danger level."