News + Media
The Washington Post speaks to MIT meteorologist Kerry Emanuel, who dissects the climate science behind a recent tropical cyclone.
"
By Angela Fritz | The Washington Post
Late last week, one of the strongest tropical cyclones on record in the South Pacific made a direct hit on the island nation of Vanuatu, leaving more than 20 people dead and massive destruction in its wake.
Tropical Cyclone Pam’s sustained winds of 165 mph and gusts nearing 200 mph ripped trees from the ground and flattened homes. In the course of a day, Pam intensified from the equivalent of a Category 2 hurricane to a Category 4, before going on to become just the second Category 5 on record to directly hit an island in the South Pacific. At the time, Pam was the strongest of four concurrent cyclones in the western Pacific and Indian oceans.
It was “one of the largest and most intense cyclones” the region has seen, says Greg Holland, a senior scientist at the National Center for Atmospheric Research who has specialized in South Pacific tropical storms. “Taken together I have not seen a storm with higher damage potential in the region,” Holland told The Washington Post, “and this shows in the extensive damage that Vanuatu has suffered.”
Now, as the death toll grows and the people of Vanuatu pick up the pieces of their devastated lives, scientists are pondering what role Earth’s changing climate may have had in the destructive potential of the storm.
In a post on the climate science blog RealClimate, MIT meteorologist Kerry Emanuel dissects the science embodied in the question, coming to the conclusion that “while Pam and Haiyan, as well as other recent tropical cyclone disasters, cannot be uniquely pinned on global warming, they have no doubt been influenced by natural and anthropogenic climate change and they do remind us of our continuing vulnerability to such storms.”
MIT Climate Change Conversation gets underway with brainstorming session on how to catalyze change.
David L. Chandler | MIT News Office
A gathering of MIT students, faculty, staff, and alumni took part Thursday in a series of talks, polling questions, and brainstorming sessions aimed at spurring the whole MIT community to engage in making the Institute a world leader, role model, and catalyst for how campuses around the world can reduce their carbon footprint and create a more sustainable environment.
The event, billed as “Creating the Roadmap: Envisioning/Reducing MIT's Carbon Footprint,” began with talks outlining the MIT campus’ current energy usage and emissions, along with plans for new buildings and renovations that could affect energy use. During the talks, participants responded to polling questions covering both factual information about campus energy use and opinions about priorities for improvement. The group then broke into small teams to brainstorm specific measures for reducing campus greenhouse-gas emissions.
“We’re here to engage you all in renewing the campus in a sustainable manner,” said Israel Ruiz, MIT’s executive vice president and treasurer, who initiated MIT’s creation two years ago of an Office of Sustainability. “It’s an issue I care a lot about,” he said, “how we’re actually going to change the world through what we do here.”
Ruiz pointed out that the campus already faces the need to carry out about $2 billion worth of renovations on its existing buildings over the next five to 10 years, but that need also presents a great opportunity for improving the overall energy efficiency of the campus.
In addition, he said, MIT has “a lot of great research where we can use the campus for experiments,” and potentially find innovations that other institutions can emulate.
Introducing the event, Christoph Reinhart, a professor of architecture, said the idea was to “seek broad input and see how MIT can respond” to the challenge of climate change, “and what we can do as a community” to address our own energy usage. “Based on this input, we will write a final report,” he said, and over the next few months all members of the MIT community are encouraged to submit suggestions and comments online, which need not be fully thought out or detailed.
MIT has 171 buildings totaling 12 million square feet of space, he said. Though that’s a minuscule footprint by global standards, how MIT manages its own facilities could have a disproportionate impact, he said: “We see ourselves as a catalyst for change, and there are a lot of people in the world looking at what we do.” If the Institute can find solutions on campus that are scalable and replicable, he said, “we can have an influence.”
Henry “Jake” Jacoby, professor emeritus of management, underscored that point, saying “MIT is really small, but in terms of demonstrating what can be done, it’s really important.” He suggested that one critical step would be to change the way accounting is done so that different departments and labs would explicitly have to account for their energy use in their own budgets, giving them a direct incentive to find more efficient approaches.
It’s essential, Jacoby added, given MIT’s reputation in the world, that “we don’t do things that are symbolic but don’t have a real effect.” It’s important to “not just do it, but do it right.”
In moving toward innovative solutions, one key aspect for a data-driven place like MIT is to improve the ability to collect detailed energy use data at a building-by-building level or better, and that process is already well underway, said Sarah Bylinsky, a program manager in the sustainability office. “We can’t manage what we can’t measure,” she said.
In making MIT into a world-leading example of how to maximize efficiency and minimize its impact on climate, Ruiz said that “the willingness is there, and there’s a full menu of opportunities.” Now, he said, it’s up to all the members of the MIT family to “help us choose what’s most effective for our community.”
In looking for innovative solutions, Bylinsky said, “we have to embolden ourselves. We shouldn’t be afraid to think big, beyond our current capacity, and to do as much as we can.”
Professor of civil and environmental engineering Dara Entekhabi, science team leader of NASA's SMAP satellite, marvels at the project's first snapshot of Earth.
By Kelsey Damrad | MIT Department of Civil and Environmental Engineering
As severe weather hazards continue to afflict parts of the country to historic extremes, Professor Dara Entekhabi of the MIT Department of Civil and Environmental Engineering (CEE) and a team of NASA scientists are providing an unprecedented resource for accurately observing soil moisture, enabling more precise prediction of weather and climate.
On March 4, NASA’s Soil Moisture Active Passive (SMAP) satellite, whose science team Entekhabi leads, successfully completed the initial test of its science instruments and revealed its very first image of the Earth’s soil moisture. These “first-light” images were composed of thin slivers of data from a 40-kilometer line scan and revealed details about the Earth’s soil moisture levels.
During this test, Entekhabi explains, the satellite is not spinning as it orbits the Earth pole-to-pole. As a result, it images a narrow footprint on the ground. Later in March, when the satellite begins to spin as it orbits, the ground footprint will not only have higher resolution but it will also cover a 1,000-kilometer-wide swath.
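To see roughly how spinning widens the coverage, here is a back-of-envelope sketch of the conical-scan geometry in Python. The orbit altitude and incidence angle used below are assumed nominal SMAP-like values, not figures given in this article.

```python
import math

# Back-of-envelope sketch of why spinning the antenna widens the ground swath.
# The orbit altitude and incidence angle are assumed nominal values,
# not figures taken from this article.
R_EARTH_KM = 6371.0
altitude_km = 685.0        # assumed orbit altitude
incidence_deg = 40.0       # assumed constant incidence angle of the conical scan

# Law of sines in the Earth-center / satellite / footprint triangle:
# sin(incidence) / (R + h) = sin(look_angle) / R
look_angle_rad = math.asin(math.sin(math.radians(incidence_deg)) *
                           R_EARTH_KM / (R_EARTH_KM + altitude_km))

# Earth-central angle between the sub-satellite point and the footprint
central_angle_rad = math.radians(incidence_deg) - look_angle_rad
ground_offset_km = R_EARTH_KM * central_angle_rad

print(f"footprint offset from the nadir track: {ground_offset_km:.0f} km")
print(f"full swath once the antenna spins:     {2 * ground_offset_km:.0f} km")
# ~500 km offset -> ~1,000 km swath; with the antenna fixed, only a single
# ~40 km footprint at that offset is imaged.
```

For these assumed values, the footprint sits roughly 500 kilometers to the side of the ground track, so sweeping it in a circle traces out the roughly 1,000-kilometer swath described above.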
Features visible in these early images included a sharp contrast between land and ocean — a test, Entekhabi says, of the instrument’s geolocation accuracy. Over Antarctica, the distinction between ocean, ice shelf, and land ice was clearly evident. Over land, the scattering signature of high-biomass regions demonstrated the sensitivity of the instruments even in these early data; the Amazon and Congo forests, for example, returned strong radar echoes.
“This is just a tiny snapshot of SMAP’s capabilities,” Entekhabi says. “What we are seeing, even in a 40-kilometer line scan, is really remarkable.”
Additionally, the images captured evidence of known weather events on the ground, including Cyclone Marcia — a Category 5 tropical cyclone that struck Australia on Feb. 20. The cyclone’s footprint appeared in SMAP’s image as a region of low brightness temperatures, the result of soil saturated by heavy rainfall.
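The connection between wet soil and low brightness temperature can be illustrated with a toy calculation. The linear emissivity model below is purely hypothetical and is not the SMAP retrieval algorithm; it only shows the direction of the effect: wetter soil has a higher dielectric constant, hence lower emissivity, hence a lower brightness temperature at the same physical temperature.

```python
# Toy illustration (not the SMAP retrieval algorithm) of why rain-soaked soil
# shows up as low brightness temperature: wetter soil has a higher dielectric
# constant, hence lower emissivity, and T_B is roughly emissivity times the
# soil's physical temperature. The linear emissivity model is hypothetical.
def brightness_temperature_K(soil_moisture_m3m3, t_soil_K=290.0):
    emissivity = 0.95 - 0.5 * soil_moisture_m3m3  # illustrative linear model
    return emissivity * t_soil_K

for mv in (0.05, 0.20, 0.40):  # dry, moist, rain-soaked soil
    print(f"soil moisture {mv:.2f} m^3/m^3 -> T_B ~ {brightness_temperature_K(mv):.0f} K")
```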
According to Entekhabi, the amount of accurate data received from SMAP in such an early stage is unparalleled by any other satellite mission.
“Where we are today, with just two days of data, is where other missions are after two years,” said Entekhabi. “Two other existing satellites, SMOS from Europe and Aquarius from NASA, took two years to come to the same calibration accuracy that we saw without any calibration at all.”
An Earth-monitoring mission initiated in late 1999, SMAP is designed to provide a global map of the moisture content of topsoil, giving meteorologists a resource to better predict severe weather hazards such as heavy precipitation, floods, droughts, hurricanes, and wildfires.
The satellite is now being maneuvered into its final science orbit, which will take an estimated two weeks.
Reflector antenna unfurled Feb. 24
NASA mission controllers successfully deployed SMAP’s 6-meter-wide reflector antenna on Feb. 24 — a significant milestone in the estimated three-year satellite expedition. To unfurl it, NASA sent commands to the observatory to fire an onboard pyrotechnic device that released the stowed antenna; motors then engaged and expanded the umbrella-like reflector.
Both the radiometer and radar were then activated for a two-day period — the first time the science instruments and the deployed reflector antenna, still in a non-spinning configuration, were operated to view Earth. During that time, Entekhabi and his team downloaded detailed data from the satellite and assessed overall instrument performance.
Deploying the reflector, which is made of a lightweight mesh material, is a preliminary step in the satellite’s overall mission to provide global soil moisture maps.
“Both the radar and the radiometer performed absolutely flawlessly and beyond the team’s expectations,” he says. “The satellite was only on for 48 hours, and the team was able to process that very limited data all the way through the data systems. This is a testament to the preparedness of the mission team and the motivation to do it so quickly.”
On March 31, mission controllers will put the satellite into its final science configuration by releasing the clutch that holds the antenna in place, allowing the satellite to begin spinning as it orbits. The instrument will then scan a swath approximately 1,000 kilometers (620 miles) wide, rather than viewing only the area directly beneath the spacecraft.
According to Entekhabi, this instrumental global soil moisture data acquired by the satellite will also allow scientists to gain a comprehensive understanding of the interconnected nature of Earth's three major cycles: water, carbon, and energy.
“We want a global perspective on the Earth’s water cycle in order to understand how the environment works as well as some of the applications that touch our everyday lives,” said Entekhabi. “We want to bring the technical capability to sense the environment to the same level as medical imaging.” SMAP is an embodiment of the lessons taught behind the CEE doors, he continued.
SMAP’s science operations will commence at the beginning of May, providing a high-resolution map of the globe’s soil moisture every two to three days.
Instrument identifies methane’s origins in mines, deep-sea vents, and cows.
Jennifer Chu | MIT News Office
Methane is a potent greenhouse gas, second only to carbon dioxide in its capacity to trap heat in Earth’s atmosphere for a long time. The gas can originate from lakes and swamps, natural-gas pipelines, deep-sea vents, and livestock. Understanding the sources of methane, and how the gas is formed, could give scientists a better understanding of its role in warming the planet.
Now a research team led by scientists at MIT and including colleagues from the Woods Hole Oceanographic Institution, the University of Toronto, and elsewhere has developed an instrument that can rapidly and precisely analyze samples of environmental methane to determine how the gas was formed.
The approach, called tunable infrared laser direct absorption spectroscopy, detects the ratio of methane isotopes, which can provide a “fingerprint” to differentiate between two common origins: microbial, in which microorganisms, typically living in wetlands or the guts of animals, produce methane as a metabolic byproduct; or thermogenic, in which organic matter, buried deep within the Earth, decays to methane at high temperatures.
The researchers used the technique to analyze methane samples from settings including lakes, swamps, groundwater, deep-sea vents, and the guts of cows, as well as methane generated by microbes in the lab.
“We are interested in the question, ‘Where does methane come from?’” says Shuhei Ono, an assistant professor of geochemistry in MIT’s Department of Earth, Atmospheric and Planetary Sciences. “If we can partition how much is from cows, natural gas, and other sources, we can more reliably strategize what to do about global warming.”
Ono and his colleagues, including first author and graduate student David Wang, publish their results this week in the journal Science.
Locking in on methane’s frequency
Methane is a molecule composed of one carbon atom linked to four hydrogen atoms. Carbon comes in two stable isotopes (carbon-12 and carbon-13); hydrogen also has two stable forms, including deuterium — an isotope of hydrogen with one extra neutron.
The authors looked for a very rare molecule of doubly isotope-substituted methane, known as 13CH3D — a molecule with both an atom of carbon-13 and a deuterium atom. Detecting 13CH3D was crucial, the researchers reasoned, as it may be a signal of the temperature at which methane formed — essential for determining whether methane is microbial or thermogenic in origin.
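The logic of using 13CH3D as a thermometer can be sketched in a few lines of Python. The calibration constants below are hypothetical placeholders, not the published calibration the MIT group uses; the point is only that the equilibrium excess of 13CH3D falls as formation temperature rises, so measuring that excess yields an apparent formation temperature.

```python
import math

# Sketch of the clumped-isotope idea only: at equilibrium, the "excess" of
# 13CH3D over a random pairing of carbon-13 and deuterium falls as formation
# temperature rises, so the measured excess implies an apparent temperature.
# The calibration constants below are hypothetical placeholders.
A_CAL = 1.0e6   # hypothetical coefficient (K^2 * permil)
B_CAL = 0.0     # hypothetical offset (permil)

def delta_13ch3d_permil(temperature_K):
    """Equilibrium 13CH3D excess (permil) at a given formation temperature."""
    return A_CAL / temperature_K**2 + B_CAL

def apparent_temperature_K(delta_permil):
    """Invert the calibration to infer an apparent formation temperature."""
    return math.sqrt(A_CAL / (delta_permil - B_CAL))

# A low-temperature (microbial-like) and a high-temperature (thermogenic-like) case:
for T in (313.0, 673.0):  # roughly 40 C and 400 C
    d = delta_13ch3d_permil(T)
    print(f"T = {T - 273.15:.0f} C -> excess = {d:.2f} permil "
          f"-> recovered T = {apparent_temperature_K(d) - 273.15:.0f} C")
```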
Last year, Ono and colleagues, working with scientists from Aerodyne Research, built an instrument to detect 13CH3D. The technique uses infrared spectroscopy to detect specific frequencies corresponding to minute motions within molecules of methane; different frequencies correspond to different isotopes. This spectroscopic approach, which is fundamentally different from the classical mass spectrometric methods being developed by others, has the advantage of portability, allowing its potential deployment in field locations.
Methane’s pulse
The team collected samples of methane from settings such as lakes, swamps, natural gas reservoirs, the digestive tracts of cows, and deep ancient groundwater, as well as methane made by microbes in the lab.
The group noticed something unexpected in some samples. For example, based on the isotope ratios they detected in cow rumen, they calculated that this methane formed at 400 degrees Celsius — impossible, as cow stomachs are typically about 40 C. They observed similar incongruities in samples from lakes and swamps. The isotope ratios, they reasoned, must not be a perfect indicator of temperature.
Instead, Wang and his colleagues identified a relationship between a feature of the bonds linking carbon and hydrogen in methane molecules — a quality they deemed “clumpiness” — and the rate at which methane was produced: The clumpier the bond, the slower the rate of methanogenesis.
“Cow guts produce methane at very high rates — up to 500 liters a day per cow. They’re giant methane fermenters, and they prefer to make less-clumped methane, compared to geologic processes, which happen very slowly,” Wang says. “We’re measuring a degree of clumpiness of the carbon and hydrogen isotopes that helps us get an idea of how fast the methane formed.”
The researchers applied this new interpretation to methane formed by microbes in the lab, and found good agreement between the isotopes detected and the rates at which the gas formed. They then used the technique to analyze methane from Kidd Creek Mine, in Canada — one of the deepest accessible points on Earth — and two sites in California where the Earth’s mantle rock reacts with groundwater. These are sites in which the origins of methane were unclear.
“It’s been a longstanding question how those fluids were developed,” Wang says. “Now we have a baseline that we can use to explore how methane forms in environments on Earth and beyond.”
Robert Hazen, executive director of the Deep Carbon Observatory in Washington, D.C., who was not involved in the research, sees the group’s new detector as “a radical departure from traditional mass spectrometer techniques. This new approach offers significant advantages in terms of its size, expense, and adaptability to multiple clumped isotope systems. One can even envision a time when a portable unit might be used in field studies. That's an amazing advance.”
This research was funded in part by the National Science Foundation, Shell Oil, the Deep Carbon Observatory, the Natural Sciences and Engineering Research Council of Canada, and the German Research Foundation.
Speakers at 10th annual MIT Energy Conference see progress, but great need for more research.
David L. Chandler | MIT News Office
At the conclusion of MIT’s 10th annual Energy Conference, panelist Cheryl Martin, director of the U.S. Department of Energy’s ARPA-E research program, declared, “There is no more important issue than energy.”
Urging students to work to supply sufficient energy for a growing population while reducing emissions, Martin added, “It needs every discipline to be in the game.”
The student-run event was founded by David Danielson PhD ’08, then an MIT graduate student and now the U.S. assistant secretary of energy and director of the Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy. Danielson, who also founded MIT’s Energy Club — now the Institute’s largest student group — returned to campus as both a keynote speaker and as moderator of the final panel discussion at this year’s conference.
“The last 10 years have been a wild ride in energy,” Danielson said, reflecting on dramatic and unexpected changes including the shale-gas boom, the rise of extreme weather events, and a steep drop in the costs of solar and wind power.
Danielson observed that MIT’s long history of cross-disciplinary research has been especially relevant to energy and combating climate change. “This interdisciplinary culture is a critical element,” he said. “The divisions really blur.”
Danielson also reflected on how far the Institute has progressed in energy research and education. When he arrived on campus, in 2001, as a student interested in solar power, he was astonished to find only one course in that field, and no energy-related student organization. Now MIT has dozens of researchers working in the solar arena; the Energy Club has more than 5,000 student, faculty, staff, and alumni members, and organizes more than 100 events every year. Danielson also pointed out that five of the club’s past presidents are now working at the DOE.
And MIT research is having a real impact, he added: Among the 17 initial advanced research projects that the DOE funded when Danielson started working there, one was by Donald Sadoway, the John F. Elliott Professor in Materials Science and Engineering, to produce a low-cost liquid battery for utility-scale storage. While this “phenomenal idea” was first seen as risky and speculative, it has now led to the creation of a company that has raised $50 million and will be installing its first grid-scale batteries this year.
Robert Armstrong, the Chevron Professor in Chemical Engineering and director of the MIT Energy Initiative, kicked off the conference by summarizing a dual challenge: On the one hand, he said, the world is expected to need to double its energy supply by 2050, thanks to a combination of population growth and rising standards of living in developing nations. At the same time, there is a need for drastically reduced emissions from present energy-supply systems, which are still mostly based on fossil fuels. “How do you double the output and still reduce carbon emissions?” Armstrong asked.
He suggested five broad areas that could contribute: a major growth in cheap, reliable solar energy; better methods for storing energy; improvements in the adaptability and reliability of the electric grid; increased use of nuclear energy; and the development of affordable and dependable carbon-capture and sequestration systems.
The conference featured addresses by energy executives including Thomas Siebel, founder and CEO of C3 Energy; William Colton, vice president for strategic planning at Exxon Mobil; Ahmad Chatila, president and CEO of SunEdison; Dirk Smit, vice president of exploration technology research and development at Shell Oil; and William von Hoene Jr., chief strategy officer of Exelon Corp.
Siebel, who started C3 Energy after founding and running a large software-services company, said he sees enormous opportunities in applying big-data analytics to the ever-growing data generated by energy-monitoring systems such as smart meters and home-automation devices. “The data rates are staggering,” he said; his company has developed analytical systems that can save utilities $300 per meter, on average, by improving efficiency in power distribution based on real-time usage information.
But Siebel also pointed out the vulnerabilities inherent in the nation’s electric grid, which he called “the largest and most complex machine ever built. … It’s a large problem, and nobody’s going to do anything about it until something breaks.”
Chatila, who left the semiconductor industry to lead solar-energy company SunEdison, said the big challenges for renewable energy are cost and intermittency. But he also finds cause for optimism in the extraordinary, often-unanticipated breakthroughs in the world of microprocessors. Over the last two decades, he said, the cost of solar cells has dropped from $10 a watt to $0.35, a trend he expects to continue. Within a few years, Chatila said, solar will be the lowest-cost option for new electric production, without ongoing subsidy. In fact, he said, that’s already the case in some places, such as in much of India.
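As a quick check on the magnitude of that decline, the sketch below assumes a 20-year span (a rough reading of “the last two decades”; the article gives no exact interval) and computes the implied compound annual change.

```python
# Rough arithmetic behind the quoted cost decline: $10/W to $0.35/W over an
# assumed 20-year span (the article says only "the last two decades").
start_cost, end_cost, years = 10.0, 0.35, 20  # $/W, $/W, years
annual_change = (end_cost / start_cost) ** (1.0 / years) - 1.0
print(f"implied average annual cost change: {annual_change:.1%}")  # about -15%
```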
Meanwhile, there are enormous opportunities for savings based on using the energy we produce more efficiently, several speakers said. Harvey Michaels, a research scientist and lecturer on energy efficiency at the MIT Sloan School of Management, said that efficiency upgrades such as better home insulation and more efficient appliances and machinery represent a $25 billion field and an “untapped potential resource.”
“When energy prices go through the roof, it’s a lot easier to sell,” Michaels said. “But we’re not going to have that for quite a while.”
ClimateWire article: Plans to clean up China's air may increase emissions of carbon dioxide.
ClimateWire via Scientific American
China's efforts to improve urban air quality are often viewed as a boost to the fight against climate change, but a new joint China-U.S. study says otherwise.
The study—carried out by researchers at the Massachusetts Institute of Technology and Tsinghua University in Beijing—was released last week. It shows that China's strategies for cleaning up air do not necessarily lead to carbon dioxide emissions reductions. Sometimes, according to the study, the efforts could actually increase emissions.
The study came as cleaning up air climbed to near the top of China's policy priorities, especially with record air pollution levels in 2013. The smog triggered unprecedented public outcry that motivated Chinese leaders to declare a "war on pollution."
China rolled out its Air Pollution Action Plan, which calls for limiting coal to 65 percent of the primary energy mix and prohibiting any increase in coal use in three major urban regions along the coast. In addition to displacing coal, the plan also promotes the installation of desulfurization, dust-removal equipment and other pollutant treatment technologies in industrial boilers, furnaces and power plants, particularly those close to cities.
"The urgency with which Beijing is tackling air pollution is certainly positive, and these efforts will also have related benefits in curtailing carbon dioxide emissions—to a certain extent," the report said. "But it would be a mistake to view the current initiatives on air pollution, which are primarily aimed at scrubbing coal-related pollutants or reducing coal use, as perfectly aligned with carbon reduction."
That is because once low-cost opportunities to reduce coal are exhausted, the continued displacement of coal from China's energy mix will become more expensive. If the focus remains narrowly on air quality, the researchers say, Chinese power producers will likely stick with end-of-pipe solutions—such as scrubbing pollutants from the exhaust stream of coal power plants—rather than switching to use more renewable energy.
That, in turn, would slow China's green transition in its energy structure. Worse yet, according to the researchers, if the pollution-scrubbing technologies run on coal-generated electricity, using them could increase carbon emissions even as air quality improves.
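A minimal sketch of that mechanism, with illustrative numbers that are not taken from the MIT-Tsinghua study: end-of-pipe scrubbers draw on part of a plant's own output, so delivering the same net electricity requires burning more coal.

```python
# Illustrative numbers only (not from the MIT-Tsinghua study): end-of-pipe
# scrubbers consume part of a coal plant's own output, so delivering the same
# net electricity means burning more coal and emitting more CO2 per net MWh.
co2_per_gross_mwh = 0.95   # tCO2 per gross MWh for a hypothetical coal plant
parasitic_load = 0.015     # hypothetical 1.5% of gross output used by scrubbers

co2_per_net_mwh = co2_per_gross_mwh / (1.0 - parasitic_load)
increase = co2_per_net_mwh / co2_per_gross_mwh - 1.0
print(f"CO2 per net MWh rises by about {increase:.1%}")  # ~1.5% for these inputs
```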
John Marshall, the Cecil and Ida Green Professor of Oceanography in MIT’s Department of Earth, Atmospheric and Planetary Sciences and the Director of MIT's Climate Modeling Initiative, spoke with the MIT Club of Northern California about the role oceans play in global climate change.
New research from MIT and the Woods Hole Oceanographic Institution reveals a hidden deep-ocean carbon cycle.
Cassie Martin | Oceans at MIT
Understanding how oceans absorb and cycle carbon is crucial to understanding the ocean’s role in climate change. For approximately 50 years, scientists have known that a large pool of dissolved carbon exists in the deep ocean, but they didn’t know much about it — the carbon’s age (how long it’s been in organic form), where it came from, how it got there, or how these factors influence its role in the carbon cycle.
Now, new research from scientists at MIT and Woods Hole Oceanographic Institute (WHOI) provides deeper insights into this reservoir and reveals a dynamic deep-ocean carbon cycle mediated by the microbes that call this dark, cold environment home. The work, published in Proceedings of the National Academy of Sciences, suggests the deep ocean plays a significant role in the global carbon cycle, and has implications for our understanding of climate change, microbial ecology, and carbon sequestration.
For years, scientists thought that carbon of varying ages made up the deep-ocean reservoir and fueled the carbon cycle, but nobody could prove it. “I’ve been trying for over 20 years,” said Daniel Repeta, a senior scientist in marine chemistry and geochemistry at WHOI and co-author of the study. “Back then we didn’t have a good way to go in and pull that carbon apart to see the pieces individually. We would get half-answers that suggested this was happening, but the answer wasn’t clear enough,” he says. With the help of Daniel Rothman, a professor of geophysics in MIT’s Department of Earth, Atmospheric, and Planetary Sciences (EAPS), and Chris Follett, a postdoctoral associate in Mick Follows’ group and formerly of Rothman’s group, Repeta would soon find the answers he was looking for.
Follett thought a next-generation method called a step-wise oxidation test might be able to reveal the age distribution of the carbon pool. The team exposed water samples taken from the Pacific Ocean to ultraviolet radiation, which converts organic carbon to carbon dioxide. The gas was then collected and measured for radiocarbon, which Follett used to estimate the ages and cycling rates of the carbon. “At a minimum there are two widely separated components — one extremely young and one extremely old, and this young component is fueling the larger flux through the dissolved carbon pool,” he says.
In other words, the youngest source of dissolved organic carbon in the deep ocean originates from the surface, where phytoplankton and other marine life fix carbon from the atmosphere. Eventually these organisms die and sink down the water column, where they dissolve and are consumed by microbes. Because it takes 1,000 years for the ocean’s surface waters to replace bottom waters, scientists thought the few-centuries-old carbon, some of which is anthropogenic, couldn’t possibly contribute to the deep ocean pool. That’s no longer the case.
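A simplified sketch of the kind of two-endmember reasoning such radiocarbon data support, using entirely hypothetical fraction-modern values rather than the study's measurements:

```python
import math

# Hypothetical two-endmember sketch of the kind of partitioning the stepwise
# radiocarbon data allow; none of these fraction-modern values come from the study.
LIBBY_MEAN_LIFE = 8033.0  # years; conventional radiocarbon age = -8033 * ln(Fm)

def radiocarbon_age_yr(fraction_modern):
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

fm_young, fm_old, fm_bulk = 0.95, 0.02, 0.50  # hypothetical endmembers and bulk value

# Linear mixing: fm_bulk = f_young * fm_young + (1 - f_young) * fm_old
f_young = (fm_bulk - fm_old) / (fm_young - fm_old)

print(f"bulk apparent age:  {radiocarbon_age_yr(fm_bulk):,.0f} yr")
print(f"young endmember:    {radiocarbon_age_yr(fm_young):,.0f} yr")
print(f"old endmember:      {radiocarbon_age_yr(fm_old):,.0f} yr")
print(f"inferred young-carbon fraction of the pool: {f_young:.0%}")
```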
The researchers found that as particulate organic carbon sinks through the water column and dissolves, some of it is sequestered in the reservoir and respired by microbes. The results suggest a more active carbon cycle in the deep ocean bolstered by bacteria that utilize the reservoir as a food source. “We previously thought of the deep ocean as a lifeless and very slow system,” Repeta says. “But those processes are happening much faster than we thought.” If this microbial pump is in fact more robust, then it gives more credence to the idea of using the mechanism to sequester carbon in the deep ocean — a concept some scientists have been working on in recent years.
While some carbon in the reservoir may cycle faster, older carbon cycles much slower. This is because older sources such as hydrothermal vents, methane seeps, and ocean sediment produce carbon that isn’t easily consumed. However, these sources are often disregarded in analyses of the marine carbon cycle because they are considered too small in magnitude to be significant. But when Follett accounted for them in calculating the reservoir’s turnover, or the time it takes for carbon to completely cycle, what he found was astounding. The turnover time of the older portion of the reservoir is 30,000 years — 30 times longer than it takes for the ocean itself to cycle — which indicates these sources may be relevant. “To find something that is more consistent with the biochemical story was fun and surprising,” says Follett. “A lot of people have proposed these ideas over the years, but they haven’t had the evidence to back them up. It was nice to come in and give them the evidence they needed to support these ideas.”
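The turnover-time comparison itself is simple stock-over-flux arithmetic; the inputs below are illustrative placeholders, not the paper's values.

```python
# Turnover time is just stock divided by flux; the inputs are illustrative
# placeholders, not values from the paper.
old_carbon_stock_pg = 600.0       # hypothetical size of the old, refractory pool (Pg C)
old_carbon_flux_pg_per_yr = 0.02  # hypothetical supply/removal flux (Pg C per year)

turnover_yr = old_carbon_stock_pg / old_carbon_flux_pg_per_yr
print(f"turnover time: {turnover_yr:,.0f} years")  # 30,000 yr for these inputs
print(f"vs. ~1,000 yr for the ocean's own overturning: {turnover_yr / 1000:.0f}x longer")
```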
So what do these findings mean for the climate system? In the short term, not much. But on a longer time-scale, one that spans thousands of years, it could affect projections of the amount of atmospheric and sequestered carbon. “It potentially has a very important influence on climate through its role in sequestration of carbon away from the atmosphere,” says Mick Follows, an EAPS associate professor in the Program of Atmospheres, Oceans, and Climate, who was not involved in the study. “If some radical change occurred that changed the nature of that pool, then it could have an effect on climate through greenhouse gas’ influence on the atmosphere.” Such changes might include, for example, deep-ocean temperature fluctuations affecting microbial activity, or a shifting surface ocean environment that could affect plankton and other organisms from which dissolved organic carbon originates.
“One of the things I’ve taken away from the work is that in a way, they’ve transformed a view of how people are thinking that pool is turning over in the deep ocean and what the sources of that are,” says Follows. “It seems like a very profound change in our understanding of how the system works relative to ingrained perspectives.”
We can expect oil prices to remain low for the foreseeable future, writes John Reilly in a column for USA Today.
John M. Reilly | Co-director, MIT Joint Program on the Science and Policy of Global Change
Worldwide, people are learning to live with less gas, but that's a habit hard to keep.
The price of oil has fallen nearly 60% since peaking in June, and lately there's been a lot of ink and pixels devoted to the question of whether oil prices will plunge even further or whether they will shoot right back up. An even bigger issue is whether prices will stay at these very low levels.
While I doubt oil prices will fall much more — how much further could they reasonably tumble? Perhaps another $20 or so? — history suggests we can expect prices to remain low for the foreseeable future. What's playing out right now in the oil market is likely the same supply-demand dynamic we've seen over and over: several years of extremely high oil prices followed by decades of low prices. The twin oil shocks of the 1970s, for instance, resulted in 20 to 25 years of low prices.
Of course, things are different today — but not that much different. Over the past six or seven years, oil has been relatively expensive, often trading at over $100 a barrel. During that time, both the supply and demand sides of the equation have responded.
On the supply side, high prices have spurred investment in oil and gas exploration. Even as OPEC (Organization of the Petroleum Exporting Countries) has maintained steady production, the U.S. is experiencing a drilling revival and the shale industry is booming. Oil production is up in other countries, too. Canada has boosted its crude oil production, as have Russia and Libya. With more oil on the market, prices predictably have fallen.
On the demand side, many developed countries — including the U.S. — are using less oil. Policies such as CAFE (Corporate Average Fuel Economy) standards have helped improve the fuel economy of cars and light trucks. Consumers, meanwhile, have recently demanded higher-mileage cars.
There are sociological forces driving oil prices down as well. For instance, people are more likely to live in cities (rather than car-dependent suburbs) and are choosing to walk or bike more.
The upshot is that unless the world changes dramatically, we should expect oil prices to remain low for at least the next 10 years. On the supply side, investments in production will continue to bear fruit; and history suggests the forces on the demand side will play out for another decade or two — or at least until people forget about high prices.