CS3 In the News
By Brad Plumer
February 22, 2013
What’s the best way to curtail gasoline consumption? Economists tend to agree on the answer here: Higher gas taxes at the pump are more effective than stricter fuel-economy standards for cars and trucks.
Much more effective, in fact. A new paper from researchers at MIT’s Global Change program finds that higher gas taxes are “at least six to fourteen times” more cost-effective than stricter fuel-economy standards at reducing gasoline consumption.
Why is that? One of the study’s co-authors, Valerie Karplus, offers a basic breakdown here: Fuel-economy standards work slowly, as manufacturers start selling more efficient vehicles, and people retire their older cars and trucks. That turnover takes time. By contrast, a higher gas tax kicks in immediately, giving people incentives to drive less, carpool more, and buy more fuel-efficient vehicles as soon as possible.
A great deal also depends on whether biofuels and other alternative fuels are available. A tax on gasoline makes these alternative fuels more competitive, whereas fuel-economy standards don’t. “We see the steepest jump in economic cost between efficiency standards and the gasoline tax if we assume low-cost biofuels are available,” Karplus said in an MIT press release.
And yet… all this economic research never seems to have any effect on lawmakers. Since 2007, Congress and the Obama administration have moved to increase federal fuel-economy standards, now scheduled to rise to 54.5 miles per gallon by 2025. According to the MIT estimates, this will cost the economy six times as much as simply raising the federal gas tax from its current level of 18.4 cents per gallon to 45 cents per gallon. Yet no one in Congress has even proposed the latter option.
One explanation is that the public just prefers things this way. Higher fuel-economy standards do impose costs, but they’re largely “hidden” costs — in the form of pricier vehicles in the showroom. A higher gas tax, by contrast, is visible every time people fill up at the pump.
In fact, a recent NBER paper by MIT’s Christopher Knittel found that this has been the case for decades. Between 1972 and 1980 the price of oil soared 650 percent. There was endless public debate during this period about how best to reduce reliance on fossil fuels. And, as Knittel discovered, the public consistently preferred price controls and fuel-economy standards over higher gas taxes. That was true no matter how often people were informed that gas taxes were the superior option.
“Given the saliency of rationing and vehicle taxes,” Knittel concluded, “it seems difficult to argue that these alternative policies were adopted because they hide their true costs.” In other words, the public seems to have an (expensive) preference for inefficient regulations over higher taxes as a way to curb gasoline use. Economists find it maddening, but it’s hard to change.
Further reading:
– On the other hand, if you want to see a rare economic argument for fuel-economy standards, check out this 2006 paper (pdf) by Christopher Knittel. He found that Americans were becoming less sensitive to fuel prices over time — which strengthened the case for policies like CAFE standards.
The average price of gasoline in the United States, $3.78 on Thursday, has been steadily climbing for more than a month and is approaching the three previous post-recession peaks, in May 2011 and in April and September of last year.
But if our goal is to get Americans to drive less and use more fuel-efficient vehicles, and to reduce air pollution and the emission of greenhouse gases, gas prices need to be even higher. The current federal gasoline tax, 18.4 cents a gallon, has been essentially stable since 1993; in inflation-adjusted terms, it’s fallen by 40 percent since then.
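As a rough check on that figure, take approximate annual-average CPI values of about 145 for 1993 and 233 for early 2013 (illustrative assumptions, not numbers from the op-ed):

$$18.4\ \text{cents} \times \frac{145}{233} \approx 11.4\ \text{cents in 1993 dollars},$$

a real decline of about 38 percent, consistent with the roughly 40 percent cited above.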
Politicians of both parties understandably fear that raising the gas tax would enrage voters. It certainly wouldn’t make lives easier for struggling families. But the gasoline tax is a tool of energy and transportation policy, not social policy, like the minimum wage.
Instead of penalizing gasoline use, however, the Obama administration chose a familiar and politically easier path: raising fuel-efficiency standards for cars and light trucks. The White House said last year that the gas savings would be comparable to lowering the price of gasoline by $1 a gallon by 2025. But it will have no effect on the 230 million passenger vehicles now on the road.
Greater efficiency packs less of a psychological punch because consumers pay more only when they buy a new car. In contrast, motorists are reminded regularly of the price at the pump. But the new fuel-efficiency standards are far less efficient than raising gasoline prices.
In a paper published online this week in the journal Energy Economics, my colleagues at the Massachusetts Institute of Technology and I estimate that the new standards will cost the economy on the whole — for the same reduction in gas use — at least six times more than a federal gas tax of roughly 45 cents per gallon of gasoline. That is because a gas tax provides immediate, direct incentives for drivers to reduce gasoline use, while the efficiency standards must squeeze the reduction out of new vehicles alone. The new standards also encourage more driving, not less, because more efficient cars cost less to drive per mile.
Other industrialized democracies have accepted much higher gas taxes as a price for roads and bridges and now depend on the revenue. In fact, Germany’s gas tax is 18 times higher than the United States’ (and seven times higher if the average state gas tax is included). The federal gasoline tax contributed about $25 billion in revenues in 2009.
Raising the tax has generally succeeded only when it was sold as a way to lower the deficit or improve infrastructure or both. A 1-cent federal gasoline tax was created in 1932, during the Depression. In 1983, President Ronald Reagan raised the tax to 9 cents from 4 cents, calling it a “user fee” to finance transportation improvements. The tax rose again, to 14.1 cents in 1990, and to 18.4 cents in 1993, as part of deficit-reduction deals under President George H. W. Bush and President Bill Clinton.
A higher gas tax would help fix crumbling highways while also generating money that could help offset the impact on low- and middle-income families. Increasing the tax, as part of a bipartisan budget deal, with a clear explanation to the public of its role in lowering oil imports and improving our air and highways, could be among the most important energy decisions we make.
Valerie J. Karplus is a research scientist in the Joint Program on the Science and Policy of Global Change at M.I.T.
Read more about the study here. 
Related: Carbon Tax a 'Win-Win-Win' for America's Future
 
Nature: Natural hazards: New York vs the sea
By: Jeff Tollefson
February 13, 2013
In the wake of Hurricane Sandy, scientists and officials are trying to protect the largest US city from future floods.
Joe Leader's heart sank as he descended into the South Ferry subway station at the southern tip of Manhattan in New York. It was 8 p.m. on 29 October, and Hurricane Sandy had just made landfall some 150 kilometres south in New Jersey. As chief maintenance officer for the New York city subway system, Leader was out on patrol. He had hoped that the South Ferry station would be a refuge from the storm. Instead, he was greeted by wailing smoke alarms and the roar of gushing water. Three-quarters of the way down the final set of stairs, he pointed his flashlight into the darkness: seawater had already submerged the train platform and was rising a step every minute or two.
“Up until that moment,” Leader recalls, standing on the very same steps, “I thought we were going to be fine.”
Opened in 2009 at a cost of US$545 million, the South Ferry station is now a mess of peeling paint, broken escalators and corroded electrical equipment. Much of Manhattan has returned to normal, but this station, just blocks from one of the world's main financial hubs, could be out of service for 2–3 years. It is just one remnant of a coastal catastrophe wrought by the largest storm in New York's recorded history.
Sandy represents the most significant test yet of the city's claim to be an international leader on the climate front. Working with scientists over the past decade, New York has sought to gird itself against extreme weather and swelling seas and to curb emissions of greenhouse gases — a long-term planning process that few other cities have attempted. But Sandy laid bare the city's vulnerabilities, killing 43 people, leaving thousands homeless, causing an estimated $19 billion in public and private losses and paralysing the financial district. The New York Stock Exchange closed for the first time since 1888, when it was shut down by a massive blizzard.
As the humbled city begins to rebuild, scientists and engineers are trying to assess what happened during Sandy and what problems New York is likely to face in a warmer future. But in a dilemma that echoes wider debates about climate change, there is no consensus about the magnitude of the potential threats — and no agreement about how much the city should spend on coastal defences to reduce them.
On 6 December, during his first major public address after the storm, New York mayor Michael Bloomberg promised to reinvest wisely and to pursue long-term sustainability. But he warned: “We have to live in the real world and make tough decisions based on the costs and benefits.” And he noted that climate change poses threats not just from flooding but also from drought and heat waves. The city must be mindful, he said, “not to fight the last war and miss the new one ahead”.
Calculated risks
In the immediate aftermath of Sandy, lower Manhattan looked like a war zone. Each night, streams of refugees wielding flashlights wandered north out of the blackout zone, where flood waters had knocked out an electrical substation.
The storm devastated several other parts of the city as well. In Staten Island, pounding waves destroyed hundreds of homes, and one neighbourhood in Queens burned to ashes after water sparked an electrical fire. Power outages lasted for more than two weeks in parts of the city. Chastened by the flooding and acutely aware that Hurricane Irene, in 2011, was a near miss, the city is now wondering what comes next.
“Is there a new normal?” asks John Gilbert, chief operating officer of Rudin Management, which manages several office buildings in downtown New York. “And if so, what is it?” Gilbert says that the company is already taking action. At one of its buildings, which took on some 19 million litres of water, the company is moving electrical systems to the second floor. “You have to think that as it has happened, it could happen again,” he says. “And it could be worse.”
At Battery Park, near the South Ferry station, the storm surge from Sandy rose 2.75 metres above the mean high-water level — the highest since gauges were installed there in 1923. In a study published last week in Risk Analysis, researchers working with data from simulated storms concluded that a surge of that magnitude would be expected to hit Battery Park about once every 500 years in the current climate (J. C. J. H. Aerts et al. Risk Anal. http://dx.doi.org/10.1111/risa.12008; 2013).
But the study authors and other scientists say that the real risks may be higher. The study used flooding at Battery Park as a measure of hurricane severity, yet it also showed that some storms could cause less damage there and still hammer the city elsewhere. Factoring in those storms could drive up the probability estimates of major hurricane damage to New York.
The 1-in-500 estimate also does not take into account the unusual nature of Sandy. Dubbed a Frankenstorm, Sandy was a marriage of a tropical cyclone and a powerful winter snowstorm, and it veered into the New Jersey coast along with the high tide of a full Moon. “It was a hybrid storm,” says Kerry Emanuel, a hurricane researcher at the Massachusetts Institute of Technology (MIT) in Cambridge and one of the study's co-authors. “We need to understand how to assess the risks from hybrid events, and I'm not convinced that we do.”
The risks will only increase as the world warms. The New York City Panel on Climate Change's 2010 assessment suggests that local sea level could rise by 0.3–1.4 metres by 2080. Last year, Emanuel and his colleagues found that floods that occur once every 100 years in the current climate could happen every 3–20 years by the end of this century if sea level rises by 1 metre. What is classified as a '500-year' event today could come every 25–240 years (N. Lin et al. Nature Clim. Change 2, 462–467; 2012).
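To see mechanically how sea-level rise compresses flood return periods, here is a minimal illustrative sketch. It assumes annual-maximum surge heights follow a Gumbel distribution with made-up location and scale parameters; it is not the model Lin, Emanuel and colleagues actually used.

```python
import math

def return_period(height_m, mu=1.0, beta=0.3):
    """Return period in years for an annual-maximum surge exceeding
    height_m, assuming a Gumbel distribution with location mu and
    scale beta (illustrative values, not fitted to Battery Park data)."""
    cdf = math.exp(-math.exp(-(height_m - mu) / beta))
    return 1.0 / (1.0 - cdf)

# Invert the Gumbel CDF to find the height of today's 100-year flood.
h100 = 1.0 - 0.3 * math.log(-math.log(1.0 - 1.0 / 100.0))

print(round(return_period(h100)))        # ~100 years in today's climate
print(round(return_period(h100 - 1.0)))  # ~4 years after 1 m of sea-level rise
```

With these toy parameters, a surge height that recurs once a century today recurs every few years once a metre of sea-level rise lowers the bar it must clear, the same qualitative effect the study reports.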
For city planners, the challenge is to rebuild and protect the city in the face of scientific uncertainty. A few scientists have said for more than a decade that the city should armour New York's harbour with a storm-surge barrier similar to the Thames barrier in London. In Sandy's wake, that idea has gained renewed interest, and a New York state panel last month called for a formal assessment of it.
Bridges and barriers
Malcolm Bowman, who heads the storm-surge modelling laboratory at the State University of New York at Stony Brook, has spearheaded the drive for barriers. He imagines a structure roughly 8 kilometres wide and 6 metres high at the entrance to the harbour, and a second barrier where the East River drains into the Long Island Sound. The state panel's cost estimates for such a system range from $7 billion to $29 billion, depending on the design. The harbour barrier could also serve as a bridge for trains and vehicles to the city's airports, suggests Bowman. “My viewpoint is not that we should start pouring concrete next week, but I do think we need to do the studies,” he says. But whether Sandy will push the city to build major defences, Bowman says, “I don't know.”
Disasters have spurred costly action in the past. The 1888 blizzard helped to drive New York to put its elevated commuter trains underground. And in 2012, the US Army Corps of Engineers completed a $1.1-billion surge barrier in New Orleans, Louisiana, as part of a $14.6-billion effort to protect the city after it was battered by hurricanes Katrina and Rita in 2005. But the New York metropolitan area is bigger and more complex than New Orleans, and protecting it will require a multi-pronged approach. Several hundred thousand city residents live along more than 800 kilometres of coastline, and a barrier would not protect much of coastal Long Island, where Sandy wrought considerable damage. Moreover, the barrier would work only against occasional storm surges. It would not hold back the slowly rising sea or protect against flooding caused by rain.
“A storm-surge barrier may be appropriate, but it's never one thing that is going to protect you,” says Adam Freed, a programme director at the Nature Conservancy in New York, who until late last year was deputy director of the city's office of long-term planning and sustainability. “It's going to be a holistic approach, including a lot of unsexy things like elevating electrical equipment out of the basement and providing more back-up generators.”
As part of that holistic effort, officials are exploring options for expanding the remaining bits of wetlands that once surrounded the city and buffered it from storms. In his address, Bloomberg called wetlands “perhaps the best natural barriers against storms that we have”.
But most of the city's wetlands have become prime real estate in recent decades, and Sandy made clear the consequences of developing those areas, says Marit Larson, director of wetlands and riparian restoration for the New York parks department.
A few weeks after the storm, Larson parks her car near the beach on Staten Island and looks out at a field of Phragmites australis, a common marsh reed. The field is part of Staten Island's 'Bluebelt' programme, initiated in the late 1980s to promote wetlands and better manage storm-water runoff. But the patch of wetlands here is smaller than a football pitch, and Sandy's surge rolled over it, damaging the nearby row houses. “If you look at the historical maps,” says Larson, “everything that used to be a wetland got wet.”
New York is now moving to strengthen its network of existing wetlands, which cover some 2,300–4,000 hectares. The mayor's budget plan for 2013–17 includes more than $200 million to restore wetlands as part of an effort to protect and redesign coastal developments.
Sandy also showed how proper construction can help to reduce risks from future storms. In one Staten Island neighbourhood, a battered roof rests on the ground, marking the spot where an ageing bungalow once stood. Next door, a newer house still stands, with no apparent damage apart from a flooded garage — sturdy proof of the value of modern building codes. In New York, newer buildings constructed in 100-year-flood zones, which are defined by the US Federal Emergency Management Agency (FEMA), cannot have any living spaces or major equipment, such as heating units, below the projected flood level (see 'Danger zone').
The city's zoning provisions could not protect against a storm like Sandy: officials estimate that two-thirds of the homes damaged by the storm were outside the 100-year-flood area. But scientists say that the FEMA flood maps were out of date, so even century-scale storms could cause damage well beyond the designated areas. Last month, FEMA began releasing new flood maps for the New York region that substantially expand this zone.
In their latest study, Emanuel and his colleagues estimate the average annual flood risk for New York as only $59 million to $129 million in direct damages. But costs could reach $5 billion for 100-year storms and $11 billion for 500-year storms. These figures do not include lost productivity or damage to major infrastructure, such as subways.
Bowman and other researchers argue that the city should commit to protecting all areas to a 500-year-flood standard, but not all the solutions are physical. A growing chorus of academics and government officials stress that the city must also bolster its response capacity and shore up the basic social services that help people to rebuild and recover.
Most importantly, the city and surrounding region need to develop a comprehensive strategy for defending the coastline, says Jeroen Aerts, a co-author of the Risk Analysis assessment who studies coastal-risk management at VU University in Amsterdam. Aerts is working with New York officials to analyse proposals for the barrier system and a suite of changes in urban planning, zoning and insurance. “You need a master plan,” he says.
“Ultimately, we all have to move together to higher ground.”
Seth Pinsky is working towards that goal. As president of the New York City Economic Development Corporation, he was tapped by Bloomberg to develop a comprehensive recovery plan that will make neighbourhoods and infrastructure safer. He points out that some newer waterfront parks and residential developments along the coast fared well during the storm. For example, at Arverne by the Sea, a housing complex in Queens, Pinsky says that units survived because they are elevated and set back from the water, with some protection from dunes. The buildings suffered little damage compared with surrounding areas.
Intelligent design
The cost of strengthening the city will be astronomical. In January, Congress approved some $60 billion to fund Sandy recovery efforts, with around $33 billion for longer-term investments, including infrastructure repair and construction by the Army Corps of Engineers. Pinsky says that he does not yet know how much of that money will go to New York, but he is sure it will not be enough. The city will define its budget in June, after his group has made its official recommendations. The rebuilding endeavour will probably necessitate a “creative” mix of public and private financing, he says. “It will probably require calling on a combination of almost every tactic that has been tried around the world.”
Even as he calls for more intelligent development, Pinsky says that New York is unlikely to take a drastic approach to dealing with storm surge and sea-level rise. “Retreating from the coastline of New York city both will not be necessary and is not really possible,” he says.
Given the sheer scale of development along the coast, it is hard to argue with Pinsky's assessment. But many climate scientists fear that bolstering coastal developments only delays the eventual reckoning and increases the likelihood of future disasters. The oceans will rise well into the future, they say, so cities will eventually be forced to accommodate the water.
“I don't see anything yet that looks towards long-term solutions,” says Klaus Jacob, a geoscientist at Columbia University's Lamont-Doherty Earth Observatory in Palisades, New York. But Jacob admits that he is as guilty as anyone. In 2003, he and his wife bought a home in a low-lying area on the Hudson River in Piermont, New York. Although it went against his professional principles, he agreed to the purchase with the assumption that he could elevate the house. But height-restriction laws prevented him from doing so, and Sandy flooded the house. The couple are now rebuilding.
“In a way, I think I was in denial about the risk,” Jacob says. He hopes that a new application to raise the house will be approved, but he still fears that the neighbourhood will not survive sea-level rise at the end of the century. New Yorkers and coastal residents everywhere would be wise to learn that lesson. “Ultimately,” Jacob says, “we all have to move together to higher ground.”
Nature 494, 162–164 (14 February 2013) doi:10.1038/494162a
IN THE NEWS: Changing with the climate 
MIT News
January 25, 2013
MIT researchers, Massachusetts officials highlight strategies to adapt to climate change.
Just days after President Obama called for action on climate change in his second inaugural address, members of Massachusetts Governor Deval Patrick’s administration joined energy and environment researchers at MIT to discuss strategies for adapting to climate change. The panel discussion on Jan. 23 fostered a continued partnership between MIT and the Commonwealth to advance energy and environment innovation. More....
IN THE NEWS: Reporter's Notebook: An inside tour of the MassDOT
The Tech
January 30, 2013
MIT students frequently use the T and other MassDOT transit systems; since 2010, our IDs even come with a built-in Charlie Card chip. But most students are unfamiliar with the inner workings of the transit system.
Ethan Feuer, Student Activities Coordinator for the MIT Energy Initiative, organized the tour for twenty-five students in order to learn more about large infrastructures and emergency preparedness in cities. More....
IN THE NEWS: Climate Research Showcase
MITEI and MIT Joint Program
February 11, 2013
MIT students, researchers help Massachusetts address a post-Sandy world.
MIT students and researchers brought their latest ideas and findings to the table at an event on January 29. The interdisciplinary group of young researchers presented to officials from the Commonwealth’s Executive Office of Energy and the Environment, in hopes that the state would be able to leverage the information for future planning and implementation. More...
 
MIT students frequently use the T and other MassDOT transit systems; since 2010, our IDs even come with a built-in Charlie Card chip. But most students are unfamiliar with the inner workings of the transit system. I was excited to take advantage of one of the opportunities offered this IAP and take a tour of several MassDOT (Massachusetts Department of Transportation) facilities, including an underground ventilation tunnel system, bus operator training school, and the organizational headquarters for the T. 
MassDOT offers variations of this tour every other week to Boston residents. The locations on the tour change based on weather. Ethan Feuer, Student Activities Coordinator for the MIT Energy Initiative, organized the tour for twenty-five students in order to learn more about large infrastructures and emergency preparedness in cities.
Our tour was led by two MassDOT veterans, Adam Hurtubise, Assistant to the Highway Administrator at Massachusetts Department of Transportation, and Darrin McAuliffe, Director of Communications and Coordination. 
We boarded our privately-chartered MBTA bus and departed for our first stop: bus driver training school. Driving a 40- or 60-foot bus through the crowded streets of Boston is no easy task. The rigorous training program accepts applicants with a Commercial Driver’s License permit, and begins testing them only eight days later to determine if they will qualify to become a bus operator. For their final exam, students must complete a serpentine maneuver, back up in a straight line, parallel park, and drive through the streets to the satisfaction of their examiner.
As part of the training program, students are introduced to the feeling of the bus driver’s seat in a simulator. We were able to give the simulator a whirl. When I first entered the simulator cab, I was surprised by the size of the steering wheel. Making tight turns with the bus required not only excellent timing but also rapid spinning of the wheel. The size of the bus and the seemingly countless rearview mirrors were disorienting and meant I was never entirely sure where the back of my simulated bus was. I successfully made a right turn and merged into traffic, only to hit a taxi seconds later as I tried to pull over to the bus stop.
The bus instructors entertained themselves by introducing obstacles, such as ambulances and elderly pedestrians, into the simulated roadway, and by turning the roads icy or making it snow in the view screen. During one particularly unfortunate drive, they caused a boulder to roll into the middle of the road. After struggling with the simulator, I am much more impressed by the MBTA drivers’ ability to maneuver these behemoths. 
The next stop on our tour was Vent Building 4, one of 13 major ventilation buildings located throughout Boston. These buildings take in fresh air from above ground, pump it into roadway tunnels, and expel the exhaust-filled air from within the tunnel. This system is key to keeping the MassDOT Central Artery roadway tunnel system pleasant to drive through, and safe from smoke buildup in case of a fire. 
Some buildings are built around vent cores, including the upscale Intercontinental Hotel. Vent buildings can be identified by the large vents on the side of them, but the vents are designed to be inconspicuous and the building interiors are mostly unaffected. You might never guess that the basements of such buildings house several-story-high fans, backup generators and batteries, and tunnels that connect most of the city of Boston.
We visited the Haymarket T station vent building. Before beginning this part of our tour, our guides outfitted us in outrageous orange hard hats and vests, because we were going to see “live traffic coming at us.”
According to Hurtubise, the Haymarket building has so much basement space that it extends deeper underground than the building rises above it. We took an elevator down into a chilly series of concrete rooms lined with pump machinery and gauges (in case of “water infiltration,” said our guides). We wandered past two large 8- and 12-cylinder diesel generators, which the city keeps in order to light the traffic tunnels in case of a power outage, and through rooms containing large arrays of backup batteries in case the generators fail, until we came to a flight of stairs leading further down. The ceilings were very high, and at this point, we began to suspect the basements were even colder than the frigid 15-degree air at ground level.
“Congratulations,” said McAuliffe, as he directed us into an enormous room with fans the size of the MIT chapel lined up on one side. “You’ve found the coldest place in Boston.”
We were in the supply plenum of the vent building. Every vent building has a supply and exhaust plenum. The supply plenum is full of fans to suck fresh air into the building. In the Haymarket plenum, we could stand in the center, look directly up, and see straight out of the skylight at the top of the building. We also visited the exhaust plenum, which was much darker, creepier, and more damaging to the lungs.
Our guides assured us that carbon monoxide within the car tunnels is continuously monitored to maintain safe levels. The vent system can also react to smoke from a car fire by pressurizing one part of the tunnel more than the other in order to dispel the smoke.
While in the plenum, our guides showed us a place where the room narrowed into a car-size tunnel. They explained that such tunnels connect most of the vent buildings together, meaning you can travel across Boston through them, much like traveling through the MIT tunnels, though perhaps not quite as luxuriously. Sometimes, said Hurtubise, the tunnels get so narrow you have to crawl. The vent building also connects directly to the car tunnel it ventilates. So, it was time for us to see some “live traffic.”
Our guides opened a door which led to a narrow concrete platform in one of Boston’s car tunnels. I had seen maintenance doors countless times in traffic tunnels, but never imagined what was on the other side. From our position, we could look down to see cars driving through the tunnels and feel the freshly ventilated air blow into our faces. 
At the ventilation building, we visited one of the emergency systems MassDOT has in place in case of superstorms like Hurricane Sandy. The low-point pump room, the deepest part of the building, deals with any flooding that may occur in that section of the tunnels. We could see evidence of the most severe flood experienced in Vent Building 4: a water mark about three feet high on the walls. According to our guides, MassDOT is unsure of how its systems would be affected by a sudden rise in water level, such as Hurricane Sandy caused in New York, and is currently conducting a study on how much their infrastructure could handle. 
By now, we were ready to warm up and feel our extremities again, so we proceeded to the MassDOT Highway Operations Center. This office, housed on the second story of an inconspicuous office building, resulted from the merger of the Massachusetts Highway Department and the Turnpike Authority, which occurred during the formation of MassDOT in 2009.
Most of the office was a single large room that resembled spy headquarters from an action movie. The back wall of the office displayed multiple video feeds from some of the 900 video cameras dispersed along the Massachusetts highway system. 
The Highway Operations Center monitors the video feeds with help from computer algorithms that identify traffic accidents and provide emergency responders with exact location and visual information. The cameras employ an accident-finding algorithm that triggers an alert in two cases: when a camera shows non-moving tail-lights, meaning it is viewing the back-up behind an accident, or when a camera shows no traffic at all, meaning it is trained on the roadway just ahead of an accident.
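In pseudocode terms, the rule amounts to two per-camera triggers. The sketch below is a guess at the shape of that logic; the function name, inputs and alert strings are invented for illustration and are not MassDOT's actual software.

```python
from typing import Optional

def check_camera(stationary_taillights: bool, vehicles_per_minute: float) -> Optional[str]:
    """Flag a camera feed that matches either accident signature described above.
    (Illustrative logic only; not the real MassDOT system.)"""
    if stationary_taillights:
        # Non-moving tail-lights: likely the queue building up behind a crash.
        return "possible back-up behind an accident"
    if vehicles_per_minute == 0:
        # An unexpectedly empty roadway: likely just downstream of a blockage.
        return "possible empty roadway in front of an accident"
    return None

# Example: a feed showing stopped tail-lights raises the first alert.
print(check_camera(stationary_taillights=True, vehicles_per_minute=0.5))
```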
The manager of the operations center, Michael Fitzpatrick, shared stories with us about incidents the office handles. The center has over-height vehicle detection systems, which alert when a truck that is too tall for a tunnel is en route to pass through it. They respond by flashing warnings on digital signs on the side of the road. Fitzpatrick said once a driver ignored the warnings and scraped a video camera off the tunnel ceiling. Police followed him in order to retrieve the camera, which was dangling from the back of his trailer. 
Being MIT students, we were especially interested to learn more about their computing systems. Another algorithm the Highway Operations Center developed works like the Google Maps traffic feature to track the speed of traffic. Sensors identify Bluetooth devices in vehicles, mainly cellphones, and record how long it takes each device to travel from checkpoint to checkpoint. Fitzpatrick explained the color coding on the traffic map. Since it was the middle of the day, most roadways were green; amusingly, some stretches were blue, indicating the average car speed was above the posted speed limit.
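A rough sketch of how that checkpoint matching might work is below; the anonymized device IDs, the median aggregation and the colour cut-offs are all assumptions for illustration, not details of the MassDOT system.

```python
from statistics import median
from typing import Dict, Optional

def segment_speed_mph(seen_at_a: Dict[str, float],
                      seen_at_b: Dict[str, float],
                      segment_miles: float) -> Optional[float]:
    """Estimate traffic speed from devices sighted at both checkpoints.
    seen_at_*: anonymized device ID -> sighting time in seconds."""
    travel_times = [seen_at_b[d] - seen_at_a[d]
                    for d in seen_at_a.keys() & seen_at_b.keys()
                    if seen_at_b[d] > seen_at_a[d]]
    if not travel_times:
        return None
    return segment_miles / (median(travel_times) / 3600.0)

def color_code(speed_mph: Optional[float], limit_mph: float) -> str:
    """Map a segment's estimated speed to a display colour (cut-offs invented)."""
    if speed_mph is None:
        return "gray"    # no matched devices on this segment
    if speed_mph > limit_mph:
        return "blue"    # faster than the posted limit
    if speed_mph > 0.7 * limit_mph:
        return "green"   # free-flowing
    return "red"         # congested

# Example: a 2-mile segment covered in a median of 120 s -> 60 mph -> "blue" at a 55 mph limit.
speed = segment_speed_mph({"dev1": 0.0, "dev2": 5.0}, {"dev1": 120.0, "dev2": 125.0}, 2.0)
print(speed, color_code(speed, 55.0))
```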
Although the Highway Operations Center uses some clever algorithms, several issues from the merger remain. According to Fitzpatrick, many of their monitoring and data-collection systems run on different platforms, so they do not communicate with each other. 
Our final stop on the tour continued to indulge our tech-oriented sides. We parked our bus outside an inconspicuous office building. Most passersby did not give the building a second glance, but the security guard in the foyer made us realize this building was important.
“No one really knows where this building is,” said Hurtubise. “We don’t advertise it.”  
We were inside the MBTA Operations Control Center, home to the logistics departments responsible for deploying T trains and MBTA buses. The operations centers for these two transit systems were located on separate floors.  
In the bus headquarters, we learned more about the role of MBTA buses. They respond to emergency situations, such as building evacuations or natural disasters, by providing buses for shelter or egress. Employees in this office were responsible for tracking the location of buses and making calls to drivers to keep them within five minutes of schedule.  
The T train operations center looked like the command center of a sci-fi ship. All the walls were painted black, and employees sat at computers arranged on terraced platforms facing the front wall of the room. On this wall, a huge projected graphic depicted the train lines, stops, and trains currently on the track.
This tour left me amazed at the amount of detail MassDOT manages every day and with great respect for its employees. Feuer called it a “wonderful” and “holistic” tour which covered many aspects of the MassDOT system.
This is the first time the MIT Energy Initiative has organized such a tour with MassDOT. The tour fit in well with this month’s theme at the Energy Initiative, “Preparing for Climate Variability.” 
After the New York subway shutdown in the aftermath of Sandy, Feuer wanted to find out how prepared Massachusetts’ transportation systems are for such an event.
Feuer said he was pleased by the feedback he received from both students and our tour guides, and, luckily for the many students on this tour’s waitlist, he hopes to do more tours in the future. 
“One student said it was a real highlight of his seven years at MIT,” said Feuer, “and Adam [Hurtubise] has told me we were the best group, that people are telling him we brought our A game.” 
This tour was a unique opportunity for students. As Feuer put it, “rarely do we get to see the underpinnings of public transit” and the “engineering marvels” involved.
By Brad Plumer
February 6, 2013
Like it or hate it, policymakers in Washington are still obsessed with the deficit. That’s why think tanks keep churning out clever plans to cut spending and raise taxes.
And here’s a new paper from the Council on Foreign Relations offering an interesting twist on the theme. Using economic modeling, Michael Levi and Citigroup’s Daniel Ahn suggest that a tax on oil consumption could be one of the least harmful ways to trim the budget deficit.
How do they figure? Levi and Ahn first assume that Congress will enact a big deficit-reduction package over the next 10 years that cuts spending by 3 percent of GDP by 2020 and raises corporate and income taxes by 1 percent of GDP by 2020. That may be unlikely in the real world, but it’s fairly similar to the much-discussed Simpson-Bowles proposal.
Next, the authors look at what would happen if Congress scrapped some of those tax hikes and spending cuts and instead replaced them with a tax on oil consumption. This could involve simply raising existing taxes on gasoline, diesel fuel, and jet fuel. They assume the oil tax would be phased in over time and come to about $50 per barrel of crude oil in 2020, or an extra $1.20 per gallon of gasoline.
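The per-gallon figure is consistent with the standard conversion of 42 U.S. gallons per barrel, spreading the tax evenly across the barrel (a simplification for this back-of-the-envelope check, not necessarily the paper's exact method):

$$\frac{\$50\ \text{per barrel}}{42\ \text{gallons per barrel}} \approx \$1.19\ \text{per gallon} \approx \$1.20.$$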
After running their economic model, Levi and Ahn found that using the oil tax to fend off some of the spending cuts and income tax hikes could be beneficial to the U.S. economy. In other words, a deficit package with an oil tax could be less harmful than a deficit package without one. Here’s the key chart:
In Variation 1, the gold line, the oil tax is used to restore part of the government spending cuts in the big deficit-reduction deal. In Variation 2, the blue line, the oil tax is used to restore part of the spending cuts and keep taxes lower. In Variation 3, the red line, the oil tax revenue is used to keep income and corporate tax rates at their current levels.
The end result: The U.S. economy performs better when there’s oil tax revenue to fend off spending cuts and tax hikes. GDP rises faster and unemployment falls further.
Why might this be? For one, Levi explained in a phone interview, a portion of the oil tax would fall on foreign countries, since the United States still imports about 40 percent of its crude. What’s more, oil in the United States is relatively lightly taxed. “Raising taxes on something that’s under-taxed, like oil, rather than something that’s already heavily taxed, like income, can yield good results,” Levi said.
Of course, this is a rather simplistic scenario, and Levi and Ahn model a few other possibilities in their full paper (pdf). For instance, it’s quite possible that an oil tax would curb U.S. fuel consumption, which might in turn lower global oil prices. (Though that’s hardly certain; a lot would depend on how OPEC responded.) In that case, the U.S. economy could see a slightly bigger boost.
Meanwhile, there are distributional consequences to consider. An oil tax is likely to be quite regressive, since poorer Americans spend a greater fraction of their income on gasoline. So Levi and Ahn looked at what would happen if half of the oil tax revenue was kicked back to consumers as lump-sum rebates, while the other half was used to reduce taxes and maintain spending levels. Even in that case, the economy performs better than it does under a standard deficit-reduction plan.
In theory, a tax on oil could have other benefits as well — if it reduces domestic fuel consumption, that would make the U.S. economy less vulnerable to large swings in global oil prices. But those benefits aren’t factored in here.
Last year, a similar study from MIT looked at the effects of using a broader carbon tax to trim the deficit. That study found that carbon taxes only offered a slight advantage over other budget-cutting measures. But there’s an important difference here — unlike the MIT study, Levi and Ahn’s paper doesn’t assume that the U.S. economy will be running at full employment anytime soon. And in that case, finding ways to blunt the impact of deficit reduction over the next 10 years could have a big effect on the course of the economy.
By Henry (Jake) Jacoby
Introduction
Mitigating climate change doesn’t sound as monumental as ending or reversing climate change. But with the global phenomenon already “contributing to the deaths of nearly 400,000 people a year and costing the world more than $1.2 trillion… annually,” according to the Climate Vulnerability Monitor, MIT professor Henry “Jake” Jacoby explains why efforts to mitigate climate change may be crucial to determining the next generation’s quality of life.
Henry (“Jake”) Jacoby is William F. Pounds professor emeritus in the MIT Sloan School of Management, and former co-director of the MIT Joint Program on the Science and Policy of Global Change.
Talking about mitigating climate change risk is a bit like the story of the man arrested for murder whose lawyer said to him, “I’ve got good news and bad news. The bad news is the blood found at the crime scene matches your DNA. The good news is your cholesterol level is down to 160.”
First, the bad news about climate change: The quantity of greenhouse gases humans have pumped into the atmosphere since the dawn of the industrial age is already changing the earth’s climate and raising global temperatures. What’s not widely recognized is that simply stabilizing global greenhouse gas emissions at today’s levels will not stabilize their atmospheric concentrations and effects on climate. Much deeper cuts will be required. Moreover, even if we succeed in reducing future emissions drastically, our children and grandchildren will have to live with the consequences of global warming – not just higher temperatures, but more severe storms, sea-level rise, fire, drought and other environmental changes.
With no additional mitigation policy, we estimate there’s about a 50/50 chance that global temperatures will rise by as much as 5 degrees Celsius by the end of this century. There’s almost a one in four chance global temperatures will rise by 6 C or more.
Over the past two decades, diplomats have tried to negotiate a deal to limit atmospheric concentrations of “Kyoto gases” (carbon dioxide, methane, nitrous oxide and other industrial gases). Their ultimate goal: to curb global temperature increases to 2 C by the year 2100. However, after analyzing the data, we find that objective daunting.
For example, one specific target is to limit atmospheric concentrations to about 450 parts per million (ppm). But given that the concentration of these gases has already risen from 275 ppm in the late 18th century to around 440 ppm today, and is climbing steadily, it’s doubtful we can achieve that goal.
But all is not lost. According to our calculations at MIT’s Joint Program on the Science and Policy of Global Change, even if we limit atmospheric concentrations of the Kyoto gases to a more modest 650 ppm, the high-end risks of climate change — temperature increases of 5 to 7 C — disappear. In other words, our grandchildren would still have to live with the disruptive effects of climate change, but they wouldn’t have to face the most catastrophic scenarios.
That’s the good news about climate change: almost anything we do to limit greenhouse gas emissions has its biggest effect on the worst possible outcomes. That’s why it’s worth keeping up the fight to reduce greenhouse gas emissions even though some targets will be hard to meet.
Here in the U.S., despite the national gridlock on climate change policy, energy-related carbon dioxide emissions have dropped in recent years — in part because of the recession but also due to a shift from coal to natural gas as a power source. The fact that President Obama talked about climate change in his inaugural address, plus the effect of recent storms and drought on public understanding of the risk, may help shift the debate toward a more aggressive policy response.
Right now, the U.S. has a cobbled-together quilt of state, regional and national policies — automobile mileage standards, appliance efficiency ratings, renewable energy subsidies — that indirectly limit greenhouse gas emissions. From an economic perspective, the cheapest and best way to reduce emissions would be with a carbon tax, or a cap-and-trade system that places a price on carbon emissions. It’s a win/win/win solution: it would (1) lower greenhouse gas emissions and oil imports, (2) raise revenue that could be used to cut other taxes, and (3) have a neutral-to-positive effect on economic growth. If a price penalty for emitting greenhouse gases is not politically feasible, then more expensive regulatory measures are going to be the way forward.
Whatever we and the other nations do, climate change will adversely affect future generations. By steadily pressing ahead to create a non-carbon-based economy by whatever means available, we can limit the damage.
Paint Pigment, Violent Raccoons and Other Surprising Mercury Trivia
Delegates gathered in Geneva this week to negotiate a global treaty to regulate the toxic chemical mercury.
I want to call your attention to a blog on "issues relevant to mercury pollution," run collectively by a group of MIT graduate students. They have been attending the United Nations talks on mercury in Geneva, Switzerland, which are due to wrap up today. Their posts are clever, funny and packed with interesting facts.
For example, did you know that the use of mercury dates as far back as 5,000 B.C.? In Spain, the Romans relied on slave labor to mine mercury, which they used as pigment in their paint. In fact, mercury-laden paint was found in homes "buried by the volcanic ash of Mount Vesuvius in 79 A.D.," the post reads. Cool, huh?
In the blog, the students document mercury's presence in popular culture, with a nod to a 1979 horror film called "Prophecy" in which "mercury waste from a logging company creates violent raccoons, salmon large enough to eat a duck and, worst of all, a giant bear-monster that may also be a reincarnated, evil forest spirit." Here's a clip.
They write of the not-so-subtle music choices, namely Queen’s “Under Pressure,” which has been broadcast repeatedly over the conference loudspeaker. After noting the Freddie Mercury connection, they suggest a more appropriate playlist, which includes “Running out of Time” by Hot Hot Heat and “Mercury Poisoning” by Graham Parker and the Rumour.
And there's this gem, written from the perspective of a mercury atom floating in a delegate's water bottle.
And most importantly, they address how little people know about mercury and the risks it poses to the environment and human health. Neurological problems, memory loss and kidney, thyroid and pulmonary system problems can occur as a result of exposure to high concentrations. Vaporized mercury can easily pass from your lungs into your blood stream and damage tissues, according to this post. A growing body of evidence, this post notes, indicates a causal relationship between methylmercury and cardiovascular disease, such as heart attacks and increased blood pressure. People working in mercury mining and refining, thermometer production, dentistry, and in the production of mercury-based chemicals are at increased risk.
Plus, here's mercury in seafood, explained.
In brief, mercury is methylated to methylmercury (CH3HgX) by bacteria in the ocean and then accumulates in fish and marine mammals. Long-lived predatory fish at the top of the food chain, such as swordfish, tilefish, shark, and tuna, can accumulate dangerously high concentrations of mercury. The US EPA lists guidelines for safe consumption of fish. Women who are pregnant or who could become pregnant should be especially careful about eating mercury-contaminated fish because the mercury can be harmful to the developing fetus.
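To make the accumulation point concrete, here is a toy sketch of biomagnification; the starting concentration, number of trophic steps and per-step factor are invented for illustration and are not measured values.

```python
def top_predator_concentration(base_conc: float, steps: int = 4, factor: float = 10.0) -> float:
    """Concentration after `steps` feeding transfers, each multiplying by `factor`.
    All numbers here are illustrative, not empirical bioaccumulation factors."""
    conc = base_conc
    for _ in range(steps):
        conc *= factor
    return conc

# A hypothetical baseline of 1 unit in plankton becomes 10,000 units
# four trophic steps later, in a long-lived predator such as swordfish.
print(top_predator_concentration(1.0))
```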
Big deal, little fanfare over global pact on mercury controls
By Carol J. Williams
January 23, 2013
It’s a highly toxic element that travels the world in mysterious ways, respects neither manmade nor natural boundaries and rapidly accumulates in people and the food they eat.
Mercury’s risks for human and environmental health have slowly but steadily come to light over the centuries, leading to ad hoc phase-outs of mercury-filled thermometers, dental amalgam and the felt-hat-shaping compound that caused brain damage in 19th century milliners, giving rise to the term “mad as a hatter.”
U.S. and European governments have invoked strict regulations in recent decades to reduce mercury emissions. But fresh research by the United Nations Environment Program and U.S. and European scientists has documented a concurrent rise in mercury emissions in Asia, Africa and the Arctic Ocean region, underscoring that mercury is a global problem in need of a collaborative solution.
A legally binding agreement to reduce emissions that was reached this past weekend at U.N.-sponsored talks in Geneva drew little notice or fanfare, probably because it still faces the rigors of ratification in 140-plus countries that will take another two to four years.
Still, getting so many states with competing economic agendas and disparate means to commit to the plan was no small feat, and not a minute too soon, in the view of environmental advocates spooked by mounting evidence of mercury’s dangers.
A European Union-coordinated study of 4,000 residents in 17 countries over the last two years found mercury levels in one-third of the test group to be above the amount considered safe, suggesting a causal link with brain damage in newborns.
“Mercury has been known as a toxin and a hazard for centuries, but today we have many of the alternative technologies and processes needed to reduce the risks for tens of millions of people, including pregnant mothers and their babies,” said Achim Steiner, the U.N. Environment Program chief, in heralding the successful conclusion of the decade-long International Negotiating Committee on Mercury.
Studies released by the U.N. agency ahead of the culminating negotiations lent urgency to the forum’s mission. In the world body’s "Global Mercury Assessment 2013," emissions of the toxic metal from artisanal gold mining were shown to have doubled since 2005. Researchers attributed some of the rise to more thorough reporting from developing nations, but blamed more of it on the lure of record prices commanded for the precious metal.
A separate U.N. study said coal burning was responsible for about 24% of mercury emissions globally each year, with a heavy concentration in Asia, where smokestacks lack the emissions-scrubbing equipment widely used in North America and Europe.
Franz Perrez, international affairs division chief for Switzerland’s environmental office, attributed the unusual unity of purpose that secured the mercury pact in Geneva to a forum less subject to the rich-poor divides bedeviling the world body’s pursuit of a climate change treaty.
“There are some differences over financing and burden-sharing and over the compliance mechanism, but these are typical,” said Perrez, adding that he heard nothing to suggest ratification would be a problem.
Horse-trading remains to be done on helping developing countries switch to technologies that capture mercury emissions at the source and phase out antiquated and dangerous mining practices, said Noelle Selin, a professor of engineering systems and atmospheric chemistry at the Massachusetts Institute of Technology.
Mercury controls have been practiced on a voluntary basis by leading industrial countries in recent years, but “the advantage of having a treaty is that it is a strong legal statement that mercury is a problem and sets guidelines and timelines for reducing its major sources,” Selin said in a phone interview from the Geneva forum.
Selin and Harvard University colleague Elsie Sunderland published an appeal in the journal Environmental Health this month for aggressive emissions reductions and pointed to the European Union study showing as many as 2 million children born on the Continent each year with mercury-induced IQ deficiencies. The study calculated that the mental health damage costs European society $12 billion a year in lost income.
How mercury that has accumulated in the environment for millennia migrates around the globe and transforms as it mixes with air, soil and water isn’t well understood, said James Hurley, director of the University of Wisconsin Aquatic Sciences Center. But an experiment he conducted over the last seven years found that new emissions from industrial activities and mercury released from melting Arctic ice and thawing permafrost were far more hazardous to the food chain than deposits in the ground.
Mercury released from coal-fired plants that falls into lakes or watersheds converts to methylmercury in water and is absorbed first by plankton, then by the fish that feed on it. To determine how quickly the element enters the food chain, Hurley put about three times the amount of mercury into one of Canada’s Experimental Lakes as would naturally make its way into the water body from rainfall and adjacent wetlands.
“We got a rapid response to new mercury added to the lake,” Hurley said. “More and more of the stable isotope kept accumulating in predator fish.”
Even more significant, he said, was the finding that as soon as researchers ceased adding mercury to the lake, mercury levels in fish showed a parallel decline.
“By eliminating the amount of mercury in the atmosphere, we hopefully will be seeing improvement in mercury levels in fish,” Hurley said, predicting what passes for rapid rewards in environmental recovery if the global compact on mercury regulations moves ahead.
A foreign correspondent for 25 years, Carol J. Williams traveled to and reported from more than 80 countries in Europe, Asia, the Middle East and Latin America.
As United Nations delegates end their mercury treaty talks today, scientists warn that ongoing emissions are more of a threat to food webs than the mercury already in the environment.
At the same time, climate change is likely to alter food webs and patterns of mercury transport in places such as the Arctic, which will further complicate efforts to keep the contaminant out of people and their food.
University of Wisconsin researchers recently found that mercury added to a lake reached top predators faster than the mercury that already existed in their environment.
“It was amazing how fast the mercury got into the fish,” said James Hurley, project researcher and director of the university’s Water Resources Institute in Madison.
And this was no lab experiment – researchers put mercury into Lake 658, part of the Experimental Lakes area in Ontario, Canada. Over a year, they put about three times the amount normally received through rainfall and nearby wetlands.
For mercury to show up in top lake predators, it has to be converted to methylmercury, mercury’s toxic form, by organisms. Then it has to move up through the food web.
At Lake 658, this happened within months.
“We started seeing the isotope we added in June accumulate in yellow perch by early fall,” Hurley said. “By the start of the second year, we were clearly seeing it even in predatory fish.”
Before this study, researchers didn’t know how long it took for mercury to move through the environment, said David Krabbenhoft, a research hydrologist with the U.S. Geological Survey’s Wisconsin Water Science Center.
Once researchers stopped adding mercury, the concentrations in fish dropped quickly.
The discovery that new mercury seems to be more of a threat than old mercury could add impetus for reducing global emissions. Critics of mercury rules often say that because mercury is an element that recirculates, new emissions have minimal impact compared with historic and natural ones.
The United Nations today adjourns a meeting in Geneva where governments of about 130 nations have been debating a mercury reduction treaty.
Asia is by far the largest source of new mercury emissions, and coal-burning power plants are the top contributor. Small-scale gold production and residential heating from other fossil fuels are other major sources.
Exposure to high levels of mercury, often from consumption of fish and other seafood, can damage developing brains, reducing children’s IQs. It also has been linked to cardiovascular effects in some adults and children.
The UN released a report leading up to the conference that showed the amount of mercury in the world’s oceans has doubled in the past century.
And global emissions are rising. An increase equivalent to about one-quarter of the 2005 human-caused mercury emissions, or about 500 tons per year, is expected by 2020 if there are no major changes in economic trends or emissions, according to a 2011 report by the Arctic Monitoring and Assessment Programme.
Climate change complicates transport
Human-driven emissions of another kind – carbon – are expected to further complicate how mercury makes its way around the planet, especially in the Arctic.
Since 1979, average Arctic sea ice has declined about 7.5 percent per decade. Loss of ice would mean more mercury in the air would land directly on water, instead of bouncing back as a gas. Conversely, the waters may purge more mercury as a gas. The net effect of these two factors is unknown.
“Thawing permafrost is already releasing significant masses of largely inorganic mercury to lakes and the Arctic Ocean,” wrote the authors of a 2011 study from Canada’s Freshwater Institute.
Warmer water, coupled with increased nutrients from permafrost and soil runoff, could bolster aquatic life. More bacteria would hasten the conversion of mercury into its dangerous form.
Harvard researchers reported in a 2012 study that twice as much of the mercury in the Arctic Ocean originates from rivers as from the atmosphere.
"At this point we can only speculate as to how the mercury enters the river systems, but it appears that climate change may play a large role," said Daniel Jacob, a co-author of the study, in a prepared statement. "As global temperatures rise, we begin to see areas of permafrost thawing and releasing mercury that was locked in the soil.”
Climate change could also alter the feeding habits of ocean creatures. A longer food chain would leave top predators such as polar bears, belugas and walruses more highly exposed to mercury, since concentrations are magnified at each step up the chain.
For some fish, warmer temperatures would bolster growth rates, diluting the mercury they accumulate. But for cold-loving fish, such as char and lake trout, growth could be stunted, increasing their mercury concentrations.
Persistent in food webs, people
Living in a region that acts as a sink for global pollutants and relying on wildlife for their diet, Arctic people have long been exposed to some of the highest levels of mercury.
Inuit pregnant women, mothers and women of childbearing age had about seven times more mercury in their blood than the level at which the U.S. Environmental Protection Agency says health problems can arise in their children, according to a 2011 study. In some parts of Greenland, about 90 percent of women of childbearing age had blood mercury levels over the EPA’s limit.
Mercury has been linked to attention problems, reduced IQs and altered heart rates in children living in the Arctic and sub-Arctic.
But it’s not just a problem in the far north.
Between 1.5 million and 2 million European children are born each year with mercury exposures above what the World Health Organization considers safe, according to a study in this month’s issue of the journal Environmental Health. Such human health impacts are the reason an emissions treaty is so vital, experts say.
“For ocean fish and people eating them, it may take decades to see the benefits,” said Noelle Selin, an engineering professor at the Massachusetts Institute of Technology. “But without a treaty, things are only going to get worse.”
David Streets, a senior scientist who studies historic mercury emissions at Argonne National Laboratory in Illinois, said mercury emissions have fallen in the United States and Europe, but a surge in coal use in fast-growing countries such as China and a resurgence of artisanal gold mining in parts of Africa are offsetting those reductions.
In 2005, the top emitter of human-caused mercury was Asia, at 65 percent of global emissions. Next highest was North America at 8.3 percent, according to U.N. data.
“This stuff cycles around so much, comes to the ground, goes back into the air, gets in people,” Streets said. “A treaty is a good start.”
Using available control technologies on coal burning, the world could cut mercury emissions by up to 60 percent by 2020 compared with today’s practices, according to the 2011 Arctic Monitoring and Assessment Programme report.
Despite the question marks that climate change raises, Hurley points to his recent research as evidence of what cutting emissions could accomplish.
“Global reductions would mean less mercury in fish, lakes and people,” Hurley said. “And, as we demonstrated, it would happen pretty quickly.”
By Juliet Eilperin
As winter begins to tighten its grip on much of the United States, air conditioning doesn’t seem like much of a survival strategy. But a new study has found that home air conditioning played a key role in reducing American death rates over the past half-century, by keeping people cool on extremely hot days.
The installation of home air conditioning, the researchers’ analysis found, is the reason the chances of dying on an extremely hot day fell 80 percent over that period.
The findings, based on a comprehensive analysis of U.S. mortality records dating from 1900, suggest that the spread of air conditioning in the developing world could play a major role in preventing future heat-related deaths linked to climate change. Very few U.S. homes had air conditioning before 1960; by 2004, the share had climbed to 85 percent.
A team of researchers from Tulane University, Carnegie Mellon University, the National Bureau of Economic Research and the Massachusetts Institute of Technology examined patterns in heat-related deaths between 1900 and 2004. The group found that days on which temperatures rose above 90 degrees Fahrenheit accounted for about 600 premature deaths annually between 1960 and 2004, one-sixth as many as would have occurred under pre-1960 conditions.
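Read literally, the group’s “one-sixth” figure implies a stark counterfactual. The following arithmetic is a back-of-the-envelope illustration, not a number reported by the researchers:

\[
6 \times 600 = 3{,}600\ \text{premature deaths per year under pre-1960 conditions},
\]

or roughly 3,000 heat-related deaths avoided each year in the era of air conditioning.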
“It’s all due to air conditioning,” said MIT environmental economics professor Michael Greenstone, one of the paper’s co-authors, adding that other factors, such as increased electrification and access to health care, did not explain the decline in heat-related mortality.
The likelihood of a premature death on an extremely hot day was 2.5 percent between 1929 and 1959, the academics found, and it dropped to less than 0.5 percent after 1960, a decline of at least 80 percent, in line with the figure cited above. The paper, which is under review at an academic journal, compared days on which temperatures exceeded 90 degrees Fahrenheit with days on which they ranged between 60 and 69 degrees Fahrenheit.
Matthew E. Kahn, an economics and public policy professor at UCLA’s Institute of Environment, called the study “a very strong paper” that could show one strategy for adapting to increasingly frequent bouts of warmer weather. The U.N. Intergovernmental Panel on Climate Change issued a report this year linking the increase in heat waves to human-generated greenhouse gas emissions, predicting the frequency of these events will increase in the coming decades.
“We have to begin to wake up to the new normal,” Kahn said. “Rational people have to learn how to duck and take action so we don’t get rolled by Mother Nature.”
The study’s results could be particularly important for nations such as India, where only a small portion of the population has residential air conditioning. The typical person in India experiences 33 days per year when the temperature rises above 90 degrees Fahrenheit; that number could grow by as much as 100 days by the end of the century, according to some climate projections.
Anand Patwardhan, a visiting professor at the School of Public Policy at the University of Maryland in College Park, said he expects home air conditioning to become more common in India, but not as a conscious response to global warming.
“While it is certainly the case that residential air-conditioning helps in reducing mortality due to temperature extremes, the rapid growth of air-conditioning in the past is perhaps more due to rising incomes and increasing affordability of air-conditioning,” he wrote in an e-mail.
The spread of air conditioning poses one obvious problem, Greenstone noted: many of these units will likely be powered by fossil fuels and will therefore increase the world’s carbon output.
“The painful part of that is the solution involves more energy consumption,” he said. “And that is going to exacerbate the problem of increased temperatures.”
Andrew Steer, president of the World Resources Institute, said that although there is no question about “air conditioning growing in leaps and bounds in developing countries with rising temperatures,” policymakers also need to explore “ecological” adaptation strategies that yield environmental benefits instead.
Indian Institute of Technology professor Ambuj Sagar wrote in an e-mail that the world should focus on improving appliance efficiency in the face of warmer weather.
“To me, if there is any policy relevance of this study, it is that the developing countries, in their drive for a comfortable life (which will also help adapt to hotter temperatures), are following the same pathway that their industrialized-country counterparts [did] because they don’t have any other pathway available,” Sagar wrote.
By Justin Gillis
I would guess a few Green readers had the experience, over the holidays, of arguing yet again about global warming with a parent or brother-in-law who thinks it’s all a big hoax. Maybe there’s some undiscovered substance in roast turkey that makes people want to pick fights around the dinner table. 
Fortunately, the M.I.T. climate scientist Kerry Emanuel has provided us with a solution to this problem: an updated edition of “What We Know About Climate Change,” his 2007 book explaining the science of global warming.
I’m happy to report that the new edition of this slender volume is an improvement — perhaps even the single best thing written about climate change for a general audience. It is a little longer than the first edition, 93 pages instead of 85, but it’s still an easy read — most people will get through it in a single sitting.
The new version updates the science to the latest numbers, of course, but it also adds a couple of chapters about the potential solutions to climate change and the bizarre politics that have cropped up around it in recent years.
The book is dead accurate, not only presenting what we know scientifically but also leveling with readers about what we don’t, and conveying the risks that ignorance poses. Yet Dr. Emanuel manages to keep the language so taut and simple that nobody is likely to be intimidated by the book or to feel put out at being asked to read it.
The point, he said in an interview, is to give people some ammunition when they encounter the kind of contrarianism about climate change that has become pervasive in the United States.
“Young adults who are disputing this problem with their own parents or an uncle or something — they can hand the book to them and say, ‘Will you at least read this?’ ” Dr. Emanuel said. “One at a time, you might change minds.”
The book is officially scheduled for publication on Tuesday, by M.I.T. Press, but it has long since moved into retail channels and is widely available in hardcover for $11. At Dr. Emanuel’s behest, the publisher set an especially low price, $7.50, for the digital edition.
He does not talk much about this in the book, but for anybody who plans to give it to a political conservative, it might be worth pointing out that Dr. Emanuel spent most of his adult life as a registered Republican. He changed his registration to independent recently, but he told me that his convictions have not shifted much — he was driven out of the Republican Party by its embrace of global warming skepticism, among other recent positions.
“I came of age in the 1960s and ’70s,” Dr. Emanuel said. “A lot of what was actually going wrong in the country was because of rigid ideology, and a lot of what I considered rigid ideology was on the left. Now I think it’s the right that’s guilty of that, that’s really gone off on this ideological tangent.”
Conservatives will find a few points in the book that especially resonate. For instance, while Dr. Emanuel assails the irrationality of dismissing an entire branch of science as some kind of elaborate hoax — as many Republicans have done lately — he also takes green groups to task on certain points, including their skepticism about nuclear power.
He sees nuclear energy as one of the few ways to reduce carbon dioxide emissions, which contribute to global warming, on a large scale. And he is doubtful that renewable energy sources like wind and solar power can be ramped up fast enough to meet the challenge.
If Dr. Emanuel has been talking about his politics more lately, so have some of his colleagues, like Richard Alley of Penn State, one of the country’s most notable explainers of climate science, who describes himself as a churchgoing Republican.
These scientists are hoping that their conservative credentials will help open some otherwise closed minds, but their ultimate point is that the science itself has nothing to do with politics — and everything to do with physics.