CS3 In the News
MIT Climate Change Conversation gets underway with a brainstorming session on how to catalyze change.
David L. Chandler | MIT News Office
A gathering of MIT students, faculty, staff, and alumni took part Thursday in a series of talks, polling questions, and brainstorming sessions aimed at spurring the whole MIT community to help make the Institute a world leader, role model, and catalyst for how campuses around the world can reduce their carbon footprints and create a more sustainable environment.
The event, billed as “Creating the Roadmap: Envisioning/Reducing MIT's Carbon Footprint,” began with talks outlining the MIT campus’ current energy usage and emissions, and presenting plans for new buildings and renovations that could affect energy use. During the talks, participants registered their responses to questions about campus energy use and their opinions on priorities for improvement. Then the group broke into small teams to brainstorm specific measures for reducing campus greenhouse-gas emissions.
“We’re here to engage you all in renewing the campus in a sustainable manner,” said Israel Ruiz, MIT’s executive vice president and treasurer, who initiated MIT’s creation two years ago of an Office of Sustainability. “It’s an issue I care a lot about,” he said, “how we’re actually going to change the world through what we do here.”
Ruiz pointed out that the campus already faces the need to carry out about $2 billion worth of renovations on its existing buildings over the next five to 10 years, but that need also presents a great opportunity for improving the overall energy efficiency of the campus.
In addition, he said, MIT has “a lot of great research where we can use the campus for experiments,” and potentially find innovations that other institutions can emulate.
Introducing the event, Christoph Reinhart, a professor of architecture, said the idea was to “seek broad input and see how MIT can respond” to the challenge of climate change, “and what we can do as a community” to address our own energy usage. “Based on this input, we will write a final report,” he said, and over the next few months all members of the MIT community are encouraged to submit suggestions and comments online, which need not be fully thought out or detailed.
MIT has 171 buildings totaling 12 million square feet of space, he said. Though that’s a minuscule footprint by global standards, how MIT manages its own facilities could have a disproportionate impact, he said: “We see ourselves as a catalyst for change, and there are a lot of people in the world looking at what we do.” If the Institute can find solutions on campus that are scalable and replicable, he said, “we can have an influence.”
Henry “Jake” Jacoby, professor emeritus of management, underscored that point, saying “MIT is really small, but in terms of demonstrating what can be done, it’s really important.” He suggested that one critical step would be to change the way accounting is done so that different departments and labs would explicitly have to account for their energy use in their own budgets, giving them a direct incentive to find more efficient approaches.
It’s essential, Jacoby added, given MIT’s reputation in the world, that “we don’t do things that are symbolic but don’t have a real effect.” It’s important to “not just do it, but do it right.”
In moving toward innovative solutions, one key aspect for a data-driven place like MIT is to improve the ability to collect detailed energy use data at a building-by-building level or better, and that process is already well underway, said Sarah Bylinsky, a program manager in the sustainability office. “We can’t manage what we can’t measure,” she said.
In making MIT into a world-leading example of how to maximize efficiency and minimize its impact on climate, Ruiz said that “the willingness is there, and there’s a full menu of opportunities.” Now, he said, it’s up to all the members of the MIT family to “help us choose what’s most effective for our community.”
In looking for innovative solutions, Bylinsky said, “we have to embolden ourselves. We shouldn’t be afraid to think big, beyond our current capacity, and to do as much as we can.”
Professor of civil and environmental engineering Dara Entekhabi, science team leader of NASA's SMAP satellite, marvels at the project's first snapshot of Earth.
by Kelsey Damrad | MIT Department of Civil and Environmental Engineering
As severe weather hazards continue to afflict parts of the country to historic extremes, Professor Dara Entekhabi of the MIT Department of Civil and Environmental Engineering (CEE) and a team of NASA scientists are providing an unprecedented resource for observing moisture levels in the land, enabling more precise prediction of weather and climate.
On March 4, NASA’s Soil Moisture Active Passive (SMAP) satellite, whose science team Entekhabi leads, successfully completed the initial test of its science instruments and revealed its very first image of the Earth’s soil moisture. These “first-light” images were composed of tiny slivers of data in a 40-kilometer line scan and revealed early details about moisture levels in the Earth’s soil.
During this test, Entekhabi explains, the satellite is not spinning as it orbits the Earth pole-to-pole. As a result, it images a narrow footprint on the ground. Later in March, when the satellite begins to spin as it orbits, the ground footprint will not only have higher resolution but it will also cover a 1,000-kilometer-wide swath.
The early images showed a defined contrast between land and ocean bodies — a test, says Entekhabi, of the satellite’s geolocation accuracy. Over Antarctica, the distinction between ocean, ice shelf, and land ice was clearly evident. Over land, regions with high biomass, such as the Amazon and Congo forests, returned strong radar echoes, demonstrating the sensitivity of the instruments even in these early data.
“This is just a tiny snapshot of SMAP’s capabilities,” Entekhabi says. “What we are seeing, even in a 40-kilometer line scan, is really remarkable.”
Additionally, the images captured the signatures of known weather events on the ground, including Cyclone Marcia — a Category 5 tropical cyclone that struck Australia on Feb. 20. The cyclone’s footprint appeared in SMAP’s image as low brightness temperatures, the result of high soil moisture left by heavy rainfall.
According to Entekhabi, the amount of accurate data received from SMAP in such an early stage is unparalleled by any other satellite mission.
“Where we are today, with just two days of data, is where other missions are after two years,” said Entekhabi. “Two other existing satellites, SMOS from Europe and Aquarius from NASA, took two years to come to the same calibration accuracy that we saw without any calibration at all.”
An Earth-monitoring mission initiated in late 1999, SMAP is designed to provide a global map of the moisture content of topsoil and provide meteorologists with a resource to better predict severe weather hazards such as heavy precipitation, floods, droughts, hurricanes, and wildfires.
The satellite is now being maneuvered into its final science orbit, which will take an estimated two weeks.
Reflector antenna unfurled Feb. 24
NASA mission controllers successfully deployed SMAP’s 6-meter-wide reflector antenna on Feb. 24 — a significant milestone in the estimated three-year satellite expedition. To unfurl it, NASA sent commands to the observatory to fire an onboard pyro that would release the stowed antenna, which ultimately engaged the motors and expanded the umbrella-like antenna.
Both the radiometer and radar were activated for a two-day period, during which Entekhabi and his team downloaded detailed data from the satellite and assessed the overall instrument performance. SMAP's science instruments and the deployed reflector antenna, in a non-spinning configuration, underwent their initial operation to view Earth.
Deploying the reflector, which is made of a lightweight mesh material, was a preliminary step in the satellite’s overall mission to provide global soil moisture maps.
“Both the radar and the radiometer performed absolutely flawlessly and beyond the team’s expectations,” he says. “The satellite was only on for 48 hours, and the team was able to process that very limited data all the way through the data systems. This is a testament to the preparedness of the mission team and the motivation to do it so quickly.”
On March 31, mission controllers will put the satellite into its final science configuration by releasing the clutch that holds the antenna in place, allowing the antenna to begin spinning as the satellite orbits. The satellite will then be able to scan a swath approximately 1,000 kilometers (620 miles) wide, rather than only the area directly beneath the spacecraft.
According to Entekhabi, this instrumental global soil moisture data acquired by the satellite will also allow scientists to gain a comprehensive understanding of the interconnected nature of Earth's three major cycles: water, carbon, and energy.
“We want a global perspective on the Earth’s water cycle in order to understand how the environment works as well as some of the applications that touch our everyday lives,” said Entekhabi. “We want to bring the technical capability to sense the environment to the same level as medical imaging.” SMAP is an embodiment of the lessons taught behind the CEE doors, he continued.
SMAP’s science operations will commence at the beginning of May, providing a high-resolution map of the globe’s soil moisture every two to three days.
Instrument identifies methane’s origins in mines, deep-sea vents, and cows.
Jennifer Chu | MIT News Office
Methane is a potent greenhouse gas, second only to carbon dioxide in its overall contribution to trapping heat in Earth’s atmosphere. The gas can originate from lakes and swamps, natural-gas pipelines, deep-sea vents, and livestock. Understanding the sources of methane, and how the gas is formed, could give scientists a better understanding of its role in warming the planet.
Now a research team led by scientists at MIT and including colleagues from the Woods Hole Oceanographic Institution, the University of Toronto, and elsewhere has developed an instrument that can rapidly and precisely analyze samples of environmental methane to determine how the gas was formed.
The approach, called tunable infrared laser direct absorption spectroscopy, detects the ratio of methane isotopes, which can provide a “fingerprint” to differentiate between two common origins: microbial, in which microorganisms, typically living in wetlands or the guts of animals, produce methane as a metabolic byproduct; or thermogenic, in which organic matter, buried deep within the Earth, decays to methane at high temperatures.
The researchers used the technique to analyze methane samples from settings including lakes, swamps, groundwater, deep-sea vents, and the guts of cows, as well as methane generated by microbes in the lab.
“We are interested in the question, ‘Where does methane come from?’” says Shuhei Ono, an assistant professor of geochemistry in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “If we can partition how much is from cows, natural gas, and other sources, we can more reliably strategize what to do about global warming.”
Ono and his colleagues, including first author and graduate student David Wang, publish their results this week in the journal Science.
Locking in on methane’s frequency
Methane is a molecule composed of one carbon atom linked to four hydrogen atoms. Carbon can come as one of two isotopes (carbon-12 or carbon-13); hydrogen can also take two forms, including as deuterium — an isotope of hydrogen with one extra neutron.
The authors looked for a very rare molecule of doubly isotope-substituted methane, known as 13CH3D — a molecule with both an atom of carbon-13 and a deuterium atom. Detecting 13CH3D was crucial, the researchers reasoned, as it may be a signal of the temperature at which methane formed — essential for determining whether methane is microbial or thermogenic in origin.
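To get a rough sense of just how rare 13CH3D is, a stochastic back-of-the-envelope estimate can be sketched. The abundance figures below are approximate natural values, not numbers from the study, and the calculation assumes isotopes are distributed at random among molecules:

```python
# Back-of-the-envelope estimate of how rare 13CH3D is, assuming isotopes
# are assigned at random ("stochastically") among methane molecules.
# Abundances are approximate natural values, not from the article.

P_13C = 0.011    # fraction of carbon atoms that are carbon-13 (~1.1%)
P_D = 0.000156   # fraction of hydrogen atoms that are deuterium (~0.0156%)

# A 13CH3D molecule needs a carbon-13 atom plus exactly one deuterium among
# four hydrogen sites: 4 ways to place the D, the other three sites holding
# ordinary hydrogen.
fraction_13CH3D = P_13C * 4 * P_D * (1 - P_D) ** 3

print(f"Roughly {fraction_13CH3D * 1e6:.1f} of every million methane "
      f"molecules are 13CH3D")
```

Under these assumptions only about seven in every million methane molecules carry both rare isotopes, which is why a purpose-built, high-precision instrument is needed to detect them.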
Last year, Ono and colleagues, working with scientists from Aerodyne Research, built an instrument to detect 13CH3D. The technique uses infrared spectroscopy to detect specific frequencies corresponding to minute motions within molecules of methane; different frequencies correspond to different isotopes. This spectroscopic approach, which is fundamentally different from the classical mass spectrometric methods being developed by others, has the advantage of portability, allowing its potential deployment in field locations.
Methane’s pulse
The team collected samples of methane from settings such as lakes, swamps, natural gas reservoirs, the digestive tracts of cows, and deep ancient groundwater, as well as methane made by microbes in the lab.
The group noticed something surprising in some samples. For example, based on the isotope ratios they detected in cow rumen, they calculated that this methane formed at 400 degrees Celsius — impossible, as cow stomachs are typically about 40 C. They observed similar incongruities in samples from lakes and swamps. The isotope ratios, they reasoned, must not be a perfect indicator of temperature.
Instead, Wang and his colleagues identified a relationship between a feature of the bonds linking carbon and hydrogen in methane molecules — a quality they deemed “clumpiness” — and the rate at which methane was produced: The clumpier the bond, the slower the rate of methanogenesis.
“Cow guts produce methane at very high rates — up to 500 liters a day per cow. They’re giant methane fermenters, and they prefer to make less-clumped methane, compared to geologic processes, which happen very slowly,” Wang says. “We’re measuring a degree of clumpiness of the carbon and hydrogen isotopes that helps us get an idea of how fast the methane formed.”
The researchers applied this new interpretation to methane formed by microbes in the lab, and found good agreement between the isotopes detected and the rates at which the gas formed. They then used the technique to analyze methane from Kidd Creek Mine, in Canada — one of the deepest accessible points on Earth — and two sites in California where the Earth’s mantle rock reacts with groundwater. These are sites in which the origins of methane were unclear.
“It’s been a longstanding question how those fluids were developed,” Wang says. “Now we have a baseline that we can use to explore how methane forms in environments on Earth and beyond.”
Robert Hazen, executive director of the Deep Carbon Observatory in Washington, D.C., who was not involved in the research, sees the group’s new detector as “a radical departure from traditional mass spectrometer techniques. This new approach offers significant advantages in terms of its size, expense, and adaptability to multiple clumped isotope systems. One can even envision a time when a portable unit might be used in field studies. That's an amazing advance.”
This research was funded in part by the National Science Foundation, Shell Oil, the Deep Carbon Observatory, the Natural Sciences and Engineering Research Council of Canada, and the German Research Foundation.
Speakers at 10th annual MIT Energy Conference see progress, but great need for more research.
David L. Chandler | MIT News Office
At the conclusion of MIT’s 10th annual Energy Conference, panelist Cheryl Martin, director of the U.S. Department of Energy’s ARPA-E research program, declared, “There is no more important issue than energy.”
Urging students to work to supply sufficient energy for a growing population while reducing emissions, Martin added, “It needs every discipline to be in the game.”
The student-run event was founded by David Danielson PhD ’08, then an MIT graduate student and now the U.S. assistant secretary of energy and director of the Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy. Danielson, who also founded MIT’s Energy Club — now the Institute’s largest student group — returned to campus as both a keynote speaker and as moderator of the final panel discussion at this year’s conference.
“The last 10 years have been a wild ride in energy,” Danielson said, reflecting on dramatic and unexpected changes including the shale-gas boom, the rise of extreme weather events, and a steep drop in the costs of solar and wind power.
Danielson observed that MIT’s long history of cross-disciplinary research has been especially relevant to energy and combating climate change. “This interdisciplinary culture is a critical element,” he said. “The divisions really blur.”
Danielson also reflected on how far the Institute has progressed in energy research and education. When he arrived on campus, in 2001, as a student interested in solar power, he was astonished to find only one course in that field, and no energy-related student organization. Now MIT has dozens of researchers working in the solar arena; the Energy Club has more than 5,000 student, faculty, staff, and alumni members, and organizes more than 100 events every year. Danielson also pointed out that five of the club’s past presidents are now working at the DOE.
And MIT research is having a real impact, he added: Among the 17 initial advanced research projects that the DOE funded when Danielson started working there, one was by Donald Sadoway, the John F. Elliott Professor in Materials Science and Engineering, to produce a low-cost liquid battery for utility-scale storage. While this “phenomenal idea” was first seen as risky and speculative, it has now led to the creation of a company that has raised $50 million and will be installing its first grid-scale batteries this year.
Robert Armstrong, the Chevron Professor in Chemical Engineering and director of the MIT Energy Initiative, kicked off the conference by summarizing a dual challenge: On the one hand, he said, the world is expected to need to double its energy supply by 2050, thanks to a combination of population growth and rising standards of living in developing nations. At the same time, there is a need for drastically reduced emissions from present energy-supply systems, which are still mostly based on fossil fuels. “How do you double the output and still reduce carbon emissions?” Armstrong asked.
He suggested five broad areas that could contribute: a major growth in cheap, reliable solar energy; better methods for storing energy; improvements in the adaptability and reliability of the electric grid; increased use of nuclear energy; and the development of affordable and dependable carbon-capture and sequestration systems.
The conference featured addresses by energy executives including Thomas Siebel, founder and CEO of C3 Energy; William Colton, vice president for strategic planning at Exxon Mobil; Ahmad Chatila, president and CEO of SunEdison; Dirk Smit, vice president of exploration technology research and development at Shell Oil; and William von Hoene Jr., chief strategy officer of Exelon Corp.
Siebel, who started C3 Energy after founding and running a large software-services company, said he sees enormous opportunities in applying big-data analytics to the ever-growing data generated by energy-monitoring systems such as smart meters and home-automation devices. “The data rates are staggering,” he said; his company has developed analytical systems that can save utilities $300 per meter, on average, by improving efficiency in power distribution based on real-time usage information.
But Siebel also pointed out the vulnerabilities inherent in the nation’s electric grid, which he called “the largest and most complex machine ever built. … It’s a large problem, and nobody’s going to do anything about it until something breaks.”
Chatila, who left the semiconductor industry to lead solar-energy company SunEdison, said the big challenges for renewable energy are cost and intermittency. But he also finds cause for optimism in the extraordinary, often-unanticipated breakthroughs in the world of microprocessors. Over the last two decades, he said, the cost of solar cells has dropped from $10 a watt to $0.35, a trend he expects to continue. Within a few years, Chatila said, solar will be the lowest-cost option for new electric production, without ongoing subsidy. In fact, he said, that’s already the case in some places, such as in much of India.
Meanwhile, there are enormous opportunities for savings based on using the energy we produce more efficiently, several speakers said. Harvey Michaels, a research scientist and lecturer on energy efficiency at the MIT Sloan School of Management, said that efficiency upgrades such as better home insulation and more efficient appliances and machinery represent a $25 billion field and an “untapped potential resource.”
“When energy prices go through the roof, it’s a lot easier to sell,” Michaels said. “But we’re not going to have that for quite a while.”
ClimateWire article: Plans to clean up China's air may increase emissions of carbon dioxide.
ClimateWire via Scientific American
China's efforts to improve urban air quality are often viewed as a boost to the fight against climate change, but a new joint China-U.S. study says otherwise.
The study—carried out by researchers at the Massachusetts Institute of Technology and Tsinghua University in Beijing—was released last week. It shows that China's strategies for cleaning up air do not necessarily lead to carbon dioxide emissions reductions. Sometimes, according to the study, the efforts could actually increase emissions.
The study came as cleaning up air climbed to near the top of China's policy priorities, especially with record air pollution levels in 2013. The smog triggered unprecedented public outcry that motivated Chinese leaders to declare a "war on pollution."
China rolled out its Air Pollution Action Plan, which calls for limiting coal to 65 percent of the primary energy mix and prohibiting any increase in coal use in three major urban regions along the coast. In addition to displacing coal, the plan also promotes the installation of desulfurization, dust-removal equipment and other pollutant treatment technologies in industrial boilers, furnaces and power plants, particularly those close to cities.
"The urgency with which Beijing is tackling air pollution is certainly positive, and these efforts will also have related benefits in curtailing carbon dioxide emissions—to a certain extent," the report said. "But it would be a mistake to view the current initiatives on air pollution, which are primarily aimed at scrubbing coal-related pollutants or reducing coal use, as perfectly aligned with carbon reduction."
That is because once low-cost opportunities to reduce coal are exhausted, the continued displacement of coal from China's energy mix will become more expensive. If the focus remains narrowly on air quality, the researchers say, Chinese power producers will likely stick with end-of-pipe solutions—such as scrubbing pollutants from the exhaust stream of coal power plants—rather than switching to use more renewable energy.
That, in turn, slows down China's green transition in energy structure. Worse yet, according to the researchers, if the pollution-scrubbing technologies are running on coal-generated electricity, the use of them could increase carbon emissions, even as air quality improves.
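The arithmetic behind that concern can be sketched with invented numbers: a scrubber imposes a parasitic load on the plant, so delivering the same net power requires burning extra coal, and the extra coal emits extra CO2 even as local air pollutants fall.

```python
# Illustrative sketch (all numbers are hypothetical, not from the study):
# a scrubber consumes a few percent of a coal plant's output, so the plant
# must generate more gross power -- and burn more coal -- to deliver the
# same net power to the grid.

net_output_mwh = 1_000_000      # annual net generation the grid needs (assumed)
co2_per_mwh = 0.9               # tonnes CO2 per MWh of coal generation (assumed)
scrubber_parasitic_load = 0.03  # scrubber consumes 3% of gross output (assumed)

gross_without = net_output_mwh
gross_with = net_output_mwh / (1 - scrubber_parasitic_load)

extra_co2 = (gross_with - gross_without) * co2_per_mwh
print(f"Extra CO2 from running the scrubber: {extra_co2:,.0f} tonnes/year")
```

Under these assumed figures, cleaning the air at one plant adds on the order of tens of thousands of tonnes of CO2 per year, which is the trade-off the researchers warn about when scrubbers run on coal-generated electricity.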
New research from MIT and the Woods Hole Oceanographic Institution reveals a hidden deep-ocean carbon cycle.
Cassie Martin | Oceans at MIT
Understanding how oceans absorb and cycle carbon is crucial to understanding the ocean’s role in climate change. For approximately 50 years, scientists have known that a large pool of dissolved carbon exists in the deep ocean, but they didn’t know much about it — the carbon’s age (how long it’s been in organic form), where it came from, how it got there, or how these factors influence its role in the carbon cycle.
Now, new research from scientists at MIT and Woods Hole Oceanographic Institute (WHOI) provides deeper insights into this reservoir and reveals a dynamic deep-ocean carbon cycle mediated by the microbes that call this dark, cold environment home. The work, published in Proceedings of the National Academy of Sciences, suggests the deep ocean plays a significant role in the global carbon cycle, and has implications for our understanding of climate change, microbial ecology, and carbon sequestration.
For years, scientists thought that carbon of varying ages made up the deep-ocean reservoir and fueled the carbon cycle, but nobody could prove it. “I’ve been trying for over 20 years,” said Daniel Repeta, a senior scientist in marine chemistry and geochemistry at WHOI and co-author of the study. “Back then we didn’t have a good way to go in and pull that carbon apart to see the pieces individually. We would get half-answers that suggested this was happening, but the answer wasn’t clear enough,” he says. With the help of Daniel Rothman, a professor of geophysics in MIT’s Department of Earth, Atmospheric, and Planetary Sciences (EAPS), and Chris Follett, a postdoctoral associate in Mick Follows’ group and formerly of Rothman’s group, Repeta would soon find the answers he was looking for.
Follett thought a next-generation method called a step-wise oxidation test might be able to reveal the age distribution of the carbon pool. The team exposed water samples taken from the Pacific Ocean to ultraviolet radiation, which converts organic carbon to carbon dioxide. The gas was then collected and measured for radiocarbon, which Follett used to estimate the carbon isotopes’ ages and cycling rate. “At a minimum there are two widely separated components — one extremely young and one extremely old, and this young component is fueling the larger flux through the dissolved carbon pool,” he says.
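The age estimates rest on standard radiocarbon conventions rather than anything specific to this study; as a minimal sketch, the conventional radiocarbon age follows from a sample's "fraction modern" (its carbon-14 content relative to a modern standard) via the Libby mean life of 8033 years:

```python
import math

# Standard radiocarbon-age convention (not from the article): conventional
# age = -8033 * ln(F), where F is the sample's 14C content relative to a
# modern standard and 8033 years is the Libby mean life.

def radiocarbon_age(fraction_modern: float) -> float:
    """Conventional radiocarbon age in years before present."""
    return -8033 * math.log(fraction_modern)

# A sample retaining half its modern 14C dates to about one Libby half-life:
print(round(radiocarbon_age(0.5)))   # ~5568 years
```

Measuring the 14C in CO2 released by successive oxidation steps thus lets the age distribution of the carbon pool be teased apart, rather than collapsing everything into a single average age.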
In other words, the youngest source of dissolved organic carbon in the deep ocean originates from the surface, where phytoplankton and other marine life fix carbon from the atmosphere. Eventually these organisms die and sink down the water column, where they dissolve and are consumed by microbes. Because it takes 1,000 years for the ocean’s surface waters to replace bottom waters, scientists thought the few-centuries-old carbon, some of which is anthropogenic, couldn’t possibly contribute to the deep ocean pool. That’s no longer the case.
The researchers found that as particulate organic carbon sinks through the water column and dissolves, some of it is sequestered in the reservoir and respired by microbes. The results suggest a more active carbon cycle in the deep ocean bolstered by bacteria that utilize the reservoir as a food source. “We previously thought of the deep ocean as a lifeless and very slow system,” Repeta says. “But those processes are happening much faster than we thought.” If this microbial pump is in fact more robust, then it gives more credence to the idea of using the mechanism to sequester carbon in the deep ocean — a concept some scientists have been working on in recent years.
While some carbon in the reservoir may cycle faster, older carbon cycles much slower. This is because older sources such as hydrothermal vents, methane seeps, and ocean sediment produce carbon that isn’t easily consumed. However, these sources are often disregarded in analyses of the marine carbon cycle because they are considered too small in magnitude to be significant. But when Follett accounted for them in calculating the reservoir’s turnover, or the time it takes for carbon to completely cycle, what he found was astounding. The turnover time of the older portion of the reservoir is 30,000 years — 30 times longer than it takes for the ocean itself to cycle — which indicates these sources may be relevant. “To find something that is more consistent with the biochemical story was fun and surprising,” says Follett. “A lot of people have proposed these ideas over the years, but they haven’t had the evidence to back them up. It was nice to come in and give them the evidence they needed to support these ideas.”
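Turnover time itself is just reservoir size divided by the flux through it; a minimal sketch with placeholder numbers (not the study's values) shows how a weakly fed pool can cycle orders of magnitude more slowly than a well-fed one:

```python
# Turnover time = reservoir size / flux through it.
# Stock and flux values below are placeholders for illustration,
# not the figures from the study.

def turnover_years(stock_pg_c: float, flux_pg_c_per_year: float) -> float:
    """Years for the reservoir to completely cycle at the given flux."""
    return stock_pg_c / flux_pg_c_per_year

# The same-sized pool turns over 100x more slowly if its input flux is 100x
# smaller:
old_pool = turnover_years(stock_pg_c=300.0, flux_pg_c_per_year=0.01)  # ~30,000 yr
young_pool = turnover_years(stock_pg_c=300.0, flux_pg_c_per_year=1.0)  # ~300 yr
print(old_pool, young_pool)
```

This is why accounting for even small fluxes from hydrothermal vents, seeps, and sediments matters: at the fluxes involved, they imply turnover times far longer than the ocean's own circulation.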
So what do these findings mean for the climate system? In the short term, not much. But on a longer timescale, one that spans thousands of years, it could affect projections of the amount of atmospheric and sequestered carbon. “It potentially has a very important influence on climate through its role in sequestration of carbon away from the atmosphere,” says Mick Follows, an EAPS associate professor in the Program of Atmospheres, Oceans, and Climate, who was not involved in the study. “If some radical change occurred that changed the nature of that pool, then it could have an effect on climate through greenhouse gases’ influence on the atmosphere.” Such changes might include, for example, deep-ocean temperature fluctuations affecting microbial activity, or a shifting surface ocean environment that could affect plankton and other organisms from which dissolved organic carbon originates.
“One of the things I’ve taken away from the work is that in a way, they’ve transformed a view of how people are thinking that pool is turning over in the deep ocean and what the sources of that are,” says Follows. “It seems like a very profound change in our understanding of how the system works relative to ingrained perspectives.”
We can expect oil prices to remain low for the foreseeable future, writes John Reilly in a column for USA Today.
John M. Reilly | Co-director, MIT Joint Program on the Science and Policy of Global Change
Worldwide, people are learning to live with less gas, but that's a habit hard to keep.
The price of oil has fallen nearly 60% since peaking in June, and lately there's been a lot of ink and pixels devoted to the question of whether oil prices will plunge even further or whether they will shoot right back up. An even bigger issue is whether prices will stay at these very low levels.
While I doubt oil prices will fall much more — how much further could they reasonably tumble? Perhaps another $20 or so? — history suggests we can expect prices to remain low for the foreseeable future. What's playing out right now in the oil market is likely the same supply-demand dynamic we've seen over and over: several years of extremely high oil prices followed by decades of low prices. The twin oil shocks of the 1970s, for instance, resulted in 20 to 25 years of low prices.
Of course, things are different today — but not that much different. Over the past six or seven years, oil has been relatively expensive, often trading at over $100 a barrel. During that time, both the supply and demand sides of the equation have responded.
On the supply side, high prices have spurred investment in oil and gas exploration. Even as OPEC (Organization of the Petroleum Exporting Countries) has maintained steady production, the U.S. is experiencing a drilling revival and the shale industry is booming. Oil production is up in other countries, too. Canada has boosted its crude oil production, as have Russia and Libya. With more oil on the market, prices predictably have fallen.
On the demand side, many developed countries — including the U.S. — are using less oil. Policies such as CAFE (Corporate Average Fuel Economy) have helped improve the fuel economy of cars and light trucks. Consumers, meanwhile, have recently demanded higher-mileage cars.
There are sociological forces driving oil prices down as well. For instance, more people are living in cities (rather than car-dependent suburbs) and choosing to walk or bike more.
The upshot is that unless the world changes dramatically, we should expect oil prices to remain low for at least the next 10 years. On the supply side, investments in production will continue to bear fruit; and history suggests the forces on the demand side will play out for another decade or two — or at least until people forget about high prices.
Read the full article here.
MIT graduate students brush up on the fundamentals of climate science and policy
Photo: Paul Kishimoto
by Audrey Resutek | MIT Joint Program on the Science and Policy of Global Change
Graduate students from the Joint Program on the Science and Policy of Global Change taught a series of classes in January as part of MIT’s annual Independent Activities Period (IAP) that were designed to bring students and community members up to speed on basic climate science, climate policy, and the state of international climate negotiations.
International climate action
Amanda Giang, a graduate student in the Engineering Systems Division, led a session on January 30 on recent climate negotiations. Climate change is a vexing international problem in part because it is a commons problem—a type of problem which many graduate students may already be familiar with, she said.
A dirty kitchen is an example of a commons problem, said Giang, who has roommates. “We all share the kitchen, so it’s in no one’s best interest to clean the kitchen alone. If I clean the kitchen myself, I have to do all the work while everyone gets the benefit. But if no one cleans the kitchen we all suffer. What we really need is some sort of coordinated collective action, where I take out the trash and my roommate does the dishes.”
Because of this, an international agreement is the best route for action. Giang reviewed the recent history of global climate negotiations, including the UN’s efforts leading up to the next round of talks in Paris this winter, where countries are expected to come to an agreement on post-2020 climate action. Giang also discussed existing greenhouse gas mitigation efforts in the US and China, and the recent emissions deal between the two countries.
Economic measurements
Paul Kishimoto, a graduate student in the Engineering Systems Division, led sessions on January 29 and January 30 on the economics of climate change and climate policy.
Economists measure the effects of climate change as costs, both direct and indirect. As an example, Kishimoto asked the class to consider how statistically warmer weather might affect a runner who jogs along the Charles. If the runner goes out when it's too hot, gets heat stroke, and has to go to the hospital, that is a cost directly attributable to climate change. If the runner instead skips the run, missing out on an activity they would otherwise do, that is counted as an indirect, or counterfactual, cost of climate change. Calculating both the costs of climate change and the costs of policies allows researchers to evaluate the effectiveness of policies addressing climate change, he said.
Photo: Amanda Giang.
Kishimoto also discussed how different types of policies aimed at reducing greenhouse gas emissions work, including measures like carbon taxes and trading plans, regulations, and policies encouraging research and development of new technology.
Climate science measurements
Daniel Gilford and Jareth Holt, graduate students in the Department of Earth, Atmospheric and Planetary Sciences, led a session on January 29 on how climate scientists measure climate change.
Gilford started the class by explaining the concept of radiative forcing, a measure of how a change in atmospheric composition (such as increasing CO2) alters the net difference between the energy the Earth and atmosphere absorb from sunlight and the energy released back into space. A change that traps more heat in the Earth system is a positive radiative forcing and contributes to warming. The primary gas causing increased radiative forcing is CO2, but other gases like methane, nitrous oxide, and ozone also play a role.
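The logarithmic dependence of CO2 forcing on concentration can be illustrated with the widely cited simplified fit from Myhre et al. (1998); this sketch is illustrative and was not part of the class material:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified radiative forcing from CO2 (Myhre et al., 1998 fit).

    Returns forcing in W/m^2 relative to a reference concentration
    (pre-industrial ~280 ppm by default).
    """
    return 5.35 * math.log(c_ppm / c0_ppm)

# Doubling CO2 from pre-industrial levels gives roughly +3.7 W/m^2,
# the canonical figure for a CO2 doubling.
print(round(co2_forcing(560.0), 2))
```

Because the relationship is logarithmic, each successive increment of CO2 adds somewhat less forcing than the one before.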
Jareth Holt discussed how climate models account for factors that affect radiative forcing. To do this, models have become more complex, Holt said. In the 1990s, for example, climate models represented aerosols only crudely and underestimated their importance in calculating radiative forcing. Models now include more detailed representations of how aerosols behave in the atmosphere.
On the other hand, there are reasons researchers might want to simplify models. Modern climate models run on supercomputers, he explained, and a single simulation can take weeks or even months. Simpler models run more quickly and allow researchers to complete a larger number of simulations, helping them understand the uncertainty in the climate system. As a result, climate modeling requires a constant balancing of complexity against computational efficiency.
Climate fundamentals
Daniel Gilford and Jareth Holt led a session on January 26 covering basic climate science, and the history of the discipline. Climate science, Holt said, is the study of variability, patterns, and statistics over time.
Photo: Daniel Gilford.
The field can trace its roots back to the 1820s, when Joseph Fourier discovered that the Earth’s atmosphere traps heat. The modern study of climate change got its start in the 1890s when Svante Arrhenius built the first simple model balancing energy in the Earth system. He determined that adding CO2 to the atmosphere traps energy, causing warming, which is a principle still used by climate scientists today.
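The kind of energy balance Arrhenius pioneered can be sketched as a zero-dimensional model: absorbed sunlight must equal emitted thermal radiation. The numbers below are standard textbook values (solar constant, planetary albedo, and an effective emissivity standing in crudely for the greenhouse effect), not figures from the article:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def equilibrium_temp(solar_const=1361.0, albedo=0.30, emissivity=0.61):
    """Zero-dimensional energy balance for Earth:

        (1 - albedo) * S / 4 = emissivity * sigma * T^4

    Averaging incoming sunlight over the whole sphere divides the
    solar constant by 4; the effective emissivity < 1 crudely
    represents heat trapped by greenhouse gases.
    """
    absorbed = (1.0 - albedo) * solar_const / 4.0
    return (absorbed / (emissivity * SIGMA)) ** 0.25

# With no greenhouse effect (emissivity = 1) Earth would sit near 255 K;
# an effective emissivity of ~0.61 recovers a realistic ~288 K surface.
print(round(equilibrium_temp(emissivity=1.0), 1))
print(round(equilibrium_temp(), 1))
```

Lowering the effective emissivity (trapping more outgoing energy) raises the equilibrium temperature, which is the essence of Arrhenius's result.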
Gilford and Holt also explained what makes a gas a greenhouse gas. The Earth’s atmosphere is made of mostly nitrogen and oxygen, but those gases absorb almost none of the energy given off by the Earth’s surface. Instead, small amounts of other gases, like water vapor and CO2, trap the most energy. Other gases like methane and nitrous oxide are present in even smaller amounts, but because they strongly absorb energy at different wavelengths than CO2 and water vapor, they can also contribute dramatically to warming.
For the full list of 2015 Global Change IAP Classes click here.
As one of the ten panels open to the public at the upcoming MIT Energy Club Conference, MIT energy economist Christopher Knittel will explore the future of shale gas with fellow experts in the field.
Francesca McCaffrey | MIT Energy Initiative
Leaders from the energy industry, government, and the scientific community will gather to discuss the world’s most pressing energy challenges at the annual MIT Energy Club Conference, to be held February 27-28 on the MIT campus. Developed and organized entirely by MIT students, the conference is this year celebrating its 10th anniversary.
Christopher Knittel, MIT’s William Barton Rogers Professor of Energy and a professor of applied economics at the Sloan School, gave MITEI a preview of the panel he’ll be moderating on the opening day of the upcoming MIT Energy Conference. In the panel, titled “Unconventional resources: successes and challenges,” Knittel will focus on the future of shale gas development in the U.S. and globally.
Here, Knittel shares some brief advance insight about the future of shale gas development.
Q: Does shale gas really provide the largest share of US natural gas production?
Knittel: According to the EIA, 40% of US natural gas production in 2012 came from shale resources. Over 60% of our natural gas comes from shale basins and what are known as tight reservoirs, which typically use the same drilling techniques as shale natural gas.
Q: Why has shale gas experienced such a boom in the US?
Knittel: The short answer is that there has been a tremendous amount of technological progress in the ability to extract natural gas (and oil) trapped in shale and tight formations. Geologists have known for years that hydrocarbons were trapped in shale basins, but not until the development of horizontal drilling, combined with hydraulic fracturing techniques, were energy companies able to economically recover these hydrocarbons.
Q: In a recent MIT News interview, you discussed how a central tenet of the EPA’s forthcoming Clean Power Plan involves shifting states from coal to natural gas. What role do you see federal regulation playing in the future of shale gas in the US?
Knittel: There are a number of important roles for federal regulation. First, left alone, the market will not shift enough away from coal to natural gas. This is because of a number of "externalities" in fossil fuel markets: social costs associated with burning fossil fuels that go unpaid by the consumers and firms in these markets. Correcting for these externalities is the role that policy makers must take. Ideally, we would have a set of pollution taxes, not just a carbon tax, but also taxes on particulate matter, mercury, etc. Because these types of policies tend to be politically infeasible, politics drive us to policies like the Clean Power Plan.
Q: Will the advent of shale gas have an impact on the development of alternative energy sources?
Knittel: Anything that lowers the prices of fossil fuels will slow the development of alternative energy sources. This is true not just for natural gas, but also for oil. Lower natural gas prices make solar and wind technologies more expensive on a relative basis. Similarly, the drop in oil prices will make it more difficult for alternative fuel vehicles to compete in the marketplace. Over the longer term, these lower fossil fuel prices will reduce R&D into alternative technologies.
Q: What are some of the key political and environmental issues that shale gas producers face?
Knittel: In the US, any drilling activity carries environmental risks, and hydraulic fracturing is no different. In addition, the added step of pumping millions of gallons of water down into the well creates a new set of environmental risks. Furthermore, natural gas leaks, known as fugitive emissions, release methane, a much more potent greenhouse gas than carbon dioxide. It is important for the federal EPA and its state-level counterparts to ensure that best practices are used in drilling for natural gas and that these practices, as well as fugitive emissions, are adequately monitored.
Q: Do you expect shale gas to truly be a “bridge” fuel, used only as a temporary solution while renewable energy technologies improve, or do you think that shale gas will hold a lasting spot in our energy ecosystem?
Knittel: This will depend on policy. Policy makers must create a set of incentives that will move markets away from natural gas and into renewable technologies. Absent these, natural gas may push coal out of the market and remain the main fuel source in electricity markets.
Professor Knittel will continue the discussion on shale gas development with panelists Paul Sheng, Director at McKinsey & Co, Jan Erik Johansson, Principal Consultant at TCS, and Helen Currie, Senior Economist at ConocoPhillips, on Friday, February 27 at 2pm. Afternoon lectures on Friday will be held at the Marriott.
To attend Professor Knittel’s panel, or to view the rest of the conference agenda and reserve your ticket, visit the MIT Energy Club Conference website by clicking here.
MIT Prof. Paul O'Gorman talks with the Boston Globe about how climate change could affect snowfall.
By Carolyn Y. Johnson | Boston Globe
When a historic blizzard dumps a record-breaking amount of snow on the region, it’s only a matter of time before someone ventures a wry joke about climate change. Maybe there’s an upside to a warmer world, after all? Less shoveling.
But the halfhearted punchline doesn’t hold up to scientific scrutiny, according to recent research from a Massachusetts Institute of Technology atmospheric scientist. In fact, a warming world could mean less overall snow in a given year, but no reprieve from extreme snow events, at least in places like Boston.
To science, not all snowstorms are the same: average snowfall is likely to decrease in most places, but the most aggravating, traffic-snarling, work-stopping, back-straining extreme storms like the one that just buried Boston could actually get bigger.
“Most studies have been about how much snow falls in a season or in a year and call that average snowfall. But of course, in terms of disruption to society or economic disruption, we’re also interested in heavy snowfalls,” said Paul O’Gorman, an associate professor of atmospheric science at MIT who published his findings in Nature. “In some regions, fairly cold regions, you could have a decrease in the average snowfall in a year, but actually an intensification of the snowfall extremes.”
O’Gorman published his findings last August, back when snow was far from the front of mind. He is currently in Australia, where the weather is sunshine and showers instead of snow, but took the time to answer a few questions by email about his counterintuitive finding.
Q: Can you explain how a warming climate might affect snowfall?
A: There are two competing effects as the climate warms: the increasing temperature causes a changeover from snow to rain, but it also increases the amount of water vapor in the atmosphere. For a particular place and time of year, which effect wins out depends on the temperature to begin with.
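The second effect O’Gorman describes, a warmer atmosphere holding more water vapor, follows the Clausius-Clapeyron relation. A quick sketch using the standard Magnus approximation for saturation vapor pressure (the formula and constants are textbook values, not from the interview):

```python
import math

def saturation_vapor_pressure(t_celsius):
    """Saturation vapor pressure over water in hPa
    (Magnus approximation)."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

# The atmosphere's capacity to hold water vapor rises roughly 7-8%
# per degree Celsius of warming, so storms that stay cold enough to
# snow have more moisture to work with.
for t in (-10.0, 0.0, 10.0):
    e0 = saturation_vapor_pressure(t)
    e1 = saturation_vapor_pressure(t + 1.0)
    print(t, round(100.0 * (e1 / e0 - 1.0), 1))
```

This is why the two effects compete: warming converts some snow to rain, but where it remains cold enough to snow, the extra moisture can intensify the heaviest snowfalls.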
Alumnus and prominent conservationist Larry Linden calls for carbon tax to combat global warming.
By David L. Chandler
Photo credit: Bryce Vickmark
MIT News Office
After a career that included work as a White House advisor in the Carter administration and as a partner at Goldman Sachs, Larry Linden SM ’70, PhD ’76 has turned his attention to what he says is the most critical issue facing humanity today: the threat of catastrophic global climate change.
Linden, speaking on campus Wednesday in the opening event of the MIT Climate Change Conversation, urged his audience to join him in making the issue a top priority — and in pushing elected leaders to take concrete action now, before changes to the world’s atmosphere and oceans become irreversibly damaging. And the most effective approach, he emphasized, is by putting a price on carbon emissions from fossil fuels.
That could take a number of forms: an outright tax on carbon, a cap-and-trade arrangement, or a revenue-neutral combination of fees and rebates. While the present political climate in the United States may make any such agreement an uphill battle, Linden stressed that his foundation — the Linden Trust for Conservation — and other groups are working hard to find centrist, bipartisan approaches that could lead toward the goal of limiting global climate change.
Pointing to unexpectedly rapid changes in public opinion in other areas, Linden said that while the prospects for political action on climate may now appear bleak, “We can be surprised, and I hope we will. This is an idea that could go from impossible to inevitable overnight.”
Linden, a board member of the World Wildlife Fund and former chairman of the board of directors of Resources for the Future, described the evolution of his thinking on climate change. He said his awareness of environmental issues started early: As a child in Pasadena, Calif., he personally experienced the terrible, pre-Clean Air Act smog of the Los Angeles basin. Now, he said, “I’m doing everything in my power to move our country to act [on climate change] at the scale that’s required.”
Unpredictable changes
At the current pace, Linden said, we face a rise in temperature of as much as 4 to 6 degrees Celsius by 2100 — and the carbon dioxide humans are now adding to the air will stay there for centuries or even millennia. This could lead, he said, “to abrupt, unpredictable and potentially irreversible changes” — such as the release of frozen methane, or the death of the Amazon rainforest — that could greatly amplify the impacts. If such large changes do occur, Linden said, “I don’t think it’s an overstatement to call this a planetary catastrophe.”
While Linden said the fossil fuel industry will fiercely resist any proposed regulations or fees aimed at limiting carbon emissions, he added that past experience gives reason to treat industry’s claims with some skepticism: Proposals to limit automobile pollution to deal with California’s smog problems faced similar objections, which proved unfounded.
“The financial system is an extremely complex, interrelated system — just like the climate system,” Linden said. While companies never like to be told what to do, he said, if federal rules constrain their actions, they will abide by the law.
In 2009, when federal cap-and-trade legislation was proposed in Congress and passed in the House, Linden was encouraged, thinking that this would be “a great start.” But when the proposed law was dropped without even being brought to a vote in the Senate, “I practically fell off my chair,” he said. In the years since, the idea has not resurfaced in Congress.
Slashing emissions
But something along those lines is exactly what is needed now, Linden said. Research has shown that to avert the most drastic climate consequences, greenhouse gas emissions must be cut by about 80 percent over the next four or five decades. In addition, massive new investment in research and development on alternative energy sources is needed, he said.
The most essential change in policy, Linden said, is a mechanism for “internalizing the externalities”: capturing the great societal costs of fossil fuel emissions in the costs of the fuels themselves. “It could be a cap-and-trade system, or a tax or a fee on carbon, or a limit on emissions,” he said.
Linden said that his foundation is supporting a revenue-neutral carbon tax as the next major national policy step. Studies have shown, he said, that an initial tax of $15 per ton of carbon emitted, with significant annual increases, could cut emissions in half by 2050.
The key, he said, lies in public action to force politicians to act. But the very nature of the issue — requiring strong action now to avoid consequences many years hence — makes action difficult: “If you designed a problem to maximize the political difficulty of addressing it, you couldn’t do much better,” Linden said. “We will therefore need extraordinary political leadership.”
“Bipartisan support is needed,” he added, noting that the Linden Trust is actively working to build support across the political spectrum. “We’re looking for centrist solutions,” he said. Like the climate itself, “our political system is an extremely complicated, interrelated system. There are possible positive feedback loops, and many nonlinear thresholds that can produce irreversible impacts. … That’s what we need to look for now.”
While the process might take years, Linden said, “We have to fight the political fight.” In the meantime, he said, state initiatives might provide useful “working examples” for national action.
Maria Zuber, MIT’s vice president for research, introduced this kickoff event of MIT's Climate Change Conversation, an effort to involve the MIT community in seeking ways in which the Institute might contribute to addressing the threat of global climate change. The proposals arising from this process, Zuber said, would be “rooted in science, would be bold, would encourage personal engagement.”
Zuber added that while this has been “a divisive issue on many campuses, of the many things that we could possibly do, there must be some things that we can all agree on that would be useful to do, and maybe we should start with doing those.”
Linden's talk was sponsored by the Committee on the MIT Climate Change Conversation, which will organize a series of events during the spring semester to engage the community in thinking about how the Institute can contribute to confronting climate change.
MIT professor is lead scientist on three-year mission to study how soil, water, and carbon interact.
David L. Chandler | MIT News Office
Dara Entekhabi, an MIT professor of civil and environmental engineering and of earth, atmospheric and planetary sciences, is the science team leader of NASA’s Soil Moisture Active Passive (SMAP) satellite, scheduled to be launched from Vandenberg Air Force Base in California on Jan. 29. The satellite will provide measurements of the moisture in the top 2 inches of the soil, everywhere on Earth, over the course of its planned three-year mission, as well as specifying whether that water is liquid or frozen. Entekhabi discussed what he hopes this mission will be able to accomplish.
Q. How much of an improvement will SMAP represent over current ways of assessing soil moisture around the world? Why is it important to be able to do so?
A. Why we need soil moisture information, and what capability SMAP adds, can be explained by following a timeline of what we know about how the Earth system works, starting in the 1980s and 1990s.
Until then, the study of the water cycle — the storage and flow of water in the environment — was partitioned between meteorology and hydrology. So long as water was in the form of vapor and precipitating clouds, it was in the domain of meteorologists. Only after precipitation hit the surface did its infiltration into soil, runoff, and stream flow become the domain of hydrologists.
Around the 1980s there was a transformative change in our thinking, with the emerging capability of fast computers and Earth system modeling. Much of that thinking, in fact, was formed at MIT: We started thinking about two-way interaction and coupling between the land and the atmospheric branches of the water cycle. In order to couple the systems, we soon realized that the key variable to track is surface soil moisture. But the ground networks to observe this variable were too few and far between to yield any meaningful insights. To make global and dynamic maps of this, we had to take on the vantage point of Earth-orbiting satellites. Starting in 2000, I became involved with NASA, and we formed a team to propose a satellite mission whose design is specifically optimized to make high-resolution and high-accuracy maps of surface soil moisture. The paired microwave-radar and radiometer instruments can sense through clouds and vegetation. The rotation of the antenna while orbiting the Earth produces a wide swath of surface measurements that can track changes in soil moisture.
Q. What would you expect to be some of the most significant findings that this mission will be able to make, and what kind of impact could these findings have?
A. With these high-quality measurements, we will have unprecedented insight into how the cycling of water weaves through its land and atmospheric branches. Because the evaporation and condensation of water also entails exchanges of energy, and because the uptake of atmospheric carbon dioxide by plants requires thawed conditions and exchange of water, soil moisture also links the three fundamental cycles of the Earth system — the water, energy, and carbon cycles — over land. These three cycles work together like gears in a clock: Perturbations in one gear will affect the others. If soil moisture were fixed everywhere, these three cycles would vary independently of one another. But with dynamic and responsive soil moisture, the three are linked over land, with synchronized variations. That is a big difference for the Earth system and its cycles. So with observations of soil moisture and improvement in links between the water, energy, and carbon cycles, our understanding of how the Earth system works will be on a new and higher level. With improved characterization and modeling, the predictions of the global and the regional environment — from short-term weather forecasts to global climate-change projections — will be impacted.
Q. How difficult is it to measure these conditions from orbit, and what kinds of tests will be needed to assess the accuracy of the data returned by the mission?
A. The main technological challenge has been the design of the large antenna and its rotation to scan a wide swath over each orbit. Given the long microwave wavelength (about 21 centimeters), the antenna has to be large in diameter to focus on a high-resolution spot on the surface. SMAP has a 6-meter lightweight mesh reflector that stows like an umbrella and unfurls in space. This large structure then has to rotate at about 15 revolutions per minute to map the surface. This is pushing the technology to its limits. We have developed a calibration and validation plan in which we pull data in real time from ground sensor networks, perform quality control, and compare with the SMAP data. Over the last two summers we have had two rehearsals of the end-to-end system. The calibration and validation will also include two airborne field campaigns during the summers of 2015 and 2016. Preliminary science data will be released after six months, and after one year we need to demonstrate, using ground-truth observations, that the data meet the accuracy requirements agreed upon by the project and NASA sponsors.
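The connection Entekhabi draws between wavelength and antenna size can be sanity-checked with a back-of-envelope diffraction estimate (beam width of order wavelength over aperture, projected from orbit). The ~685 km altitude used below is the published SMAP orbit; the calculation is a rough order-of-magnitude sketch, not the mission's actual resolution budget:

```python
def footprint_km(wavelength_m, antenna_diameter_m, altitude_km):
    """Rough diffraction-limited footprint of a spaceborne radiometer.

    Beam width ~ wavelength / aperture diameter (radians), projected
    to the surface from orbital altitude. Real instrument resolution
    also depends on antenna illumination, incidence angle, and
    ground processing, so this is only an order-of-magnitude guide.
    """
    beam_width_rad = wavelength_m / antenna_diameter_m
    return beam_width_rad * altitude_km

# SMAP-like numbers: 21 cm wavelength, 6 m reflector, ~685 km orbit.
# A much smaller dish at the same wavelength would blur the footprint
# to hundreds of kilometers, which is why the 6 m reflector is needed.
print(round(footprint_km(0.21, 6.0, 685.0), 1))
```

The estimate lands in the tens of kilometers, consistent with the need for a large deployable reflector to resolve soil moisture at useful scales.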


