News + Media

As the world’s population continues to expand, our natural resources will become increasingly strained. In an effort to find sustainable solutions for the planet’s growing population while minimizing environmental impacts, MIT’s Environmental Research Council (ERC) has put forward a detailed implementation plan to establish a Global Environmental Initiative to complement the MIT Energy Initiative (MITEI).
The interdisciplinary, faculty-led council presented the plan to the MIT community last Thursday in a forum held at the Kirsch Auditorium in the Stata Center. Council members outlined an initiative that would bring together MIT’s “core strengths” across campus to help solve the world’s pressing environmental challenges, from mitigating climate change to curbing contamination and maintaining fresh water supplies.
“It’s impossible to imagine a problem bigger and more compelling, or more suited to the strengths of MIT, than how to drive toward global sustainability,” said MIT President Susan Hockfield in a video address to the forum. “Far too often the public conversation about the environment and climate gets mired in the discourse of blame and despair. Today, I believe MIT has an opportunity, and frankly an obligation, to help replace that stalemate with the momentum of creative, realistic, positive change.”
Once launched, the Global Environmental Initiative is expected to focus on cultivating six key areas of academic research throughout MIT: climate, oceans, water, ecological resilience, contamination mitigation and sustainable societies.
Dara Entekhabi, professor of civil and environmental engineering and chair of the ERC, says that while many researchers at MIT are working in the research themes identified in the plan, often these efforts occur in isolation. For example, a biologist studying the health effects of contaminants could give valuable input to chemists designing new materials. Or a mechanical engineer designing a water purification facility may benefit from an urban planner’s perspective. The environmental initiative will aim to identify and bring together such related efforts, foster technological and social innovations in all six environmental research themes, and identify strategic directions for growth.
In the areas of climate and oceans, MIT already has a strong foundation of interdisciplinary collaboration. The Center for Global Change Science, the Joint Program on the Science and Policy of Global Change, the Climate Modeling Initiative, and the recently launched Lorenz Center all focus on understanding the climate system and human contributions to that system. Similarly, in the area of ocean science, MIT has a long history of research and educational collaboration with the Woods Hole Oceanographic Institution (WHOI).
Going forward, the Global Environmental Initiative would work to strengthen these existing efforts and identify new research priorities. For example, in the climate arena, the plan proposes increasing work devoted to reducing uncertainty in climate predictions. In the case of oceanic studies, the initiative would boost efforts to harness the potential of new data collection and analysis technologies to monitor the impacts of human activity.
In addition to strengthening existing environmental programs, Entekhabi says the initiative will plant the seeds for new cross-campus collaborations in the areas of water, ecological resilience, contamination mitigation and sustainable societies. Thursday’s forum highlighted work already underway in labs throughout MIT in these key areas.
For example, researchers across multiple departments are tackling various challenges related to water, from engineering portable desalinators and water-purifying membranes to analyzing greenhouse gas emissions from water treatment plants and designing city sidewalks that direct rainwater to green spaces. In the area of ecological resilience, biologists and geneticists are studying the central role microbes play in regulating the global environment. In an effort to mitigate future environmental contamination, chemists and material scientists are investigating ways to create environmentally “benign-by-design” products. And economists, social scientists and urban planners are envisioning ways to make societies more sustainable by examining global food supply chains, designing “green” buildings, and evaluating sustainable transportation and urban designs.
One of the initiative’s first goals, once launched, will be securing funding for graduate and postdoctoral fellowships, as well as ignition grants, to foster innovative, cross-disciplinary research projects that would otherwise struggle to attract initial funding from traditional sources. The Earth Systems Initiative, which Entekhabi currently heads, has had an ignition grant program in place to support new, high-risk projects in earth sciences. This program will likely form the foundation for similar efforts under the new initiative.
The initiative also lays out a plan for creating educational programs. “Simply put,” the plan reads, “incorporating an understanding of the linkages between environmental quality and human welfare must become an essential part of MIT’s basic educational message.” In this spirit, the initiative will host workshops and symposia, and support the development of a new undergraduate minor in environment and sustainability.
In describing the Global Environmental Initiative’s broad goals during last Thursday’s forum, Entekhabi drew a comparison with the history of medicine. He noted that in the last few decades, the use of trial-and-error methods in medicine, such as exploratory surgery and empirical drug discovery, has largely been replaced by advanced medical imaging and targeted drug synthesis.
“What we need to do for the environment is what we’ve done for our health, and our advanced medical practice,” Entekhabi said. “We need to replace trial-and-error with rational design. And that requires understanding fundamentally how the system works, in the same way as understanding fundamentally how human health works.”
The implementation plan for the Global Environmental Initiative is available for public review and comment until Feb. 10, 2012.
This talk will describe the tremendous potential benefits of shale gas and the environmental challenges posed by shale gas production. John Deutch will review the work of the Secretary of Energy Advisory Board Shale Gas Subcommittee, which he chaired, including the recommendations, the reasons for these recommendations, and the lessons to be learned from the experiences of this unusual advisory committee.
MIT report shows that with new policies, U.S. electric grid could handle expected influx of electric cars and wind and solar generation.

Over the next two decades, the U.S. electric grid will face unprecedented technological challenges stemming from the growth of distributed and intermittent new energy sources such as solar and wind power, as well as an expected influx of electric and hybrid vehicles that require frequent recharging. But a new MIT study concludes that — as long as some specific policy changes are made — the grid is most likely up to the challenge.
Study co-director Richard Schmalensee, the Howard W. Johnson Professor of Economics and Management at the MIT Sloan School of Management, says the two-year study came about “because a number of us were hearing two sorts of rhetoric” about the U.S. power grid: that it’s on the brink of widespread failure, or that simply installing some new technology could open up wonderful new opportunities.
“The most important broad finding was that both of these are false,” Schmalensee says. While the grid is not in any imminent danger, he says, “the current regulatory framework, largely established in the 1930s, is mismatched to today’s grid.” Moreover, he adds, today’s regulations are “highly unlikely [to] give us the grid of the future — a grid that by 2030 will support a range of new technologies and consumer services that will be essential for a strong and competitive U.S. economy.”
The report was commissioned by the MIT Energy Initiative (MITEI) and carried out by a panel of 13 faculty members from MIT and one from Harvard University, along with 10 graduate students and an advisory panel of 19 leaders from academia, industry and government.
While the grid’s performance is adequate today, decisions made now will shape that grid over the next 20 years. The MIT report recommends a series of changes in the regulatory environment to facilitate and exploit technological innovation. Among the report’s specific recommended changes: To enable the grid of the future — one capable of handling intermittent renewables — the United States will need effective and enhanced federal authority over decisions on the routing of new interstate transmission lines. This is especially needed, the report says, in cases where power is produced by solar or wind farms located far from where that power is to be used, requiring long-distance transmission lines to be built across multiple regulatory jurisdictions.
“It is a real issue, a chicken-and-egg problem,” says John Kassakian, a professor of electrical engineering at MIT and the study’s other co-chair. “Nobody’s going to build these new renewable energy plants unless they know there will be transmission lines to get the power to load centers. And nobody’s going to build transmission lines unless the difficulty of siting lines across multiple jurisdictions is eased.”
Currently, when new transmission lines cross state boundaries, each state involved — and federal agencies as well, if federal lands are crossed — can make its own decisions about permission for the siting of these lines, with no centralized authority.
“There are many people who can say no, and nobody who can say yes,” Schmalensee explains. “That’s strategically untenable, especially since some of these authorities would have little incentive ever to say yes.”
The MITEI report recommends that the Federal Energy Regulatory Commission (FERC) either be given the authority to make decisions in such cases, or be designated as the “backstop” authority in cases where there are disputes.
The grid would also benefit from a restructuring of the way customers pay for its costs, the study found. Payment for electric distribution, like payment for generation, is currently calculated based on usage. But most of the costs involved are fixed; they don’t depend on usage. This gives utilities incentives to resist distributed generation, such as homeowners installing rooftop solar panels, and gives consumers excessive incentives to install such systems — and thereby to shift their share of fixed network costs to their neighbors. Fixed network costs, the report says, should be recovered primarily through customer charges that don’t depend on electricity consumption.
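That cross-subsidy is easy to see with a toy calculation. The numbers in the sketch below are invented for illustration and do not come from the MIT report; it simply shows how, under per-kilowatt-hour cost recovery, a household that cuts its metered usage with rooftop solar also cuts its contribution to fixed network costs, leaving the shortfall to be recovered from its neighbors.

```python
# Hypothetical illustration of volumetric vs. fixed cost recovery.
# All numbers are invented for the sketch; none come from the MIT report.

FIXED_NETWORK_COST = 1_000_000.0   # utility's fixed distribution cost, $/year
households = 1_000                 # identical customers
usage_kwh = 10_000.0               # annual consumption per household, kWh

def volumetric_rate(total_kwh):
    """Per-kWh charge needed to recover the fixed cost from metered usage."""
    return FIXED_NETWORK_COST / total_kwh

# Before rooftop solar: everyone pays the same share.
rate = volumetric_rate(households * usage_kwh)
print(f"Initial network charge: {rate * usage_kwh:.0f} $/household")

# After 200 households halve their metered usage with rooftop solar,
# the same fixed cost must be recovered from fewer billed kilowatt-hours.
solar_homes, cut = 200, 0.5
total_kwh = (households - solar_homes) * usage_kwh + solar_homes * usage_kwh * cut
rate = volumetric_rate(total_kwh)
print(f"Non-solar household now pays: {rate * usage_kwh:.0f} $/year")
print(f"Solar household now pays:     {rate * usage_kwh * cut:.0f} $/year")
# A fixed per-customer charge (FIXED_NETWORK_COST / households) would
# recover the same cost without shifting it between neighbors.
```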
In addition, while many utilities have begun to install “smart meters” for their customers, most of these are not yet being used to provide feedback to customers that could shift electricity usage to off-peak hours.
“We haven’t done as much as we could to develop this capability, to learn how to do this,” Schmalensee says. “It could save everybody money, by cutting down the need to build new generators.” While overall growth in demand is expected to be modest and easily accommodated, without new policies peak demand will rise much faster, requiring new generating capacity. “We continue to build capacity that’s only used a few hours a year,” he says. Providing consumers with better price signals and the ability to play a more active role in managing their demand could significantly improve this imbalance, the report says.
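The peak-capacity point lends itself to a similar back-of-envelope sketch. The hourly load profile below is invented rather than drawn from the study; it only illustrates why generating capacity is sized to the single highest hour, so that shifting a modest slice of peak demand into the overnight trough reduces the capacity that must be built while leaving total energy use unchanged.

```python
# Toy example of how shifting peak load reduces the capacity that must be built.
# The 24-hour demand profile below is invented for illustration only.
hourly_demand_gw = [60, 55, 52, 50, 50, 55, 65, 75, 85, 90, 92, 95,
                    97, 98, 100, 99, 96, 94, 90, 85, 80, 75, 70, 65]

def capacity_needed(profile):
    # generators must be built to cover the single highest hour
    return max(profile)

print("Without demand response:", capacity_needed(hourly_demand_gw), "GW")

# Clip every hour above 92 GW and move that energy into the six
# overnight hours, which have plenty of headroom in this toy profile.
shifted = [min(d, 92) for d in hourly_demand_gw]
moved = sum(hourly_demand_gw) - sum(shifted)
shifted[:6] = [d + moved / 6 for d in shifted[:6]]

print("With load shifting:     ", capacity_needed(shifted), "GW")
print("Total energy unchanged: ", round(sum(shifted)) == sum(hourly_demand_gw))
```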
Another area that will require restructuring, the study concluded, is cybersecurity: The more thoroughly the grid is interconnected, and the more smart meters are added to gather data about usage patterns, the greater the risk of security breaches or cyberattacks on the system.
At the moment, no agency has responsibility and authority for the entire grid. The report strongly recommends that some agency — perhaps the U.S. Department of Homeland Security — be given such responsibility and authority, but thorny issues related to authority over local distribution systems would need to be resolved. In addition, the report notes, it will be important to develop rules and systems to maintain the privacy of data on customers’ electricity usage.
Requiring the sharing of data, especially data collected as a result of federal investments through the American Recovery and Reinvestment Act of 2009, should be a significant priority, the report says. The government “spent a lot of money on pilot programs and experiments, and installations of a lot of new equipment that can improve the efficiency and reliability of the grid and the management of demand,” Kassakian says. But there needs to be more cooperation and communication about the results of those programs “in order to get the benefits,” he says.
In fact, widespread sharing of data from real-time monitoring of the grid could help prevent some failures before they happen, Kassakian says: “If you’re aware of what’s happening at the same time everywhere, you can observe trends, and see what might be an incipient failure. That’s very useful to know, and allows better control of the system.”
The MITEI study found that growth in the number of electric vehicles (EVs) on the road is likely to be slow enough, and widely distributed enough, that it shouldn’t create significant strain on the grid — although there may be a few locations where a particularly high penetration of such vehicles could require extra generating capacity. Some other effects could be subtle: For example, in some hot regions of the Southwest, grid components such as transformers are designed to cool off overnight when demand is ordinarily low. But a sudden influx of EVs charging at night could necessitate bigger transformers or cooling systems, while charging them at the end of the work day could significantly increase peak demand and thus the need for new capacity.
Utilities now spend very little on research, the study found, because regulators provide little incentive for them to do so. The report recommends that utilities put more money into research and development — both to make effective use of new technologies for monitoring and controlling the grid, and to better understand customer response to pricing policies or incentives.
Panel discussion on the impact of climate change on agriculture and the food we eat.
AP, Seth Borenstein
WASHINGTON (AP) — Heat-trapping greenhouse gases in the atmosphere are building up so high, so fast, that some scientists now think the world can no longer limit global warming to the level world leaders have agreed upon as safe.
New figures from the U.N. weather agency Monday showed that the three biggest greenhouse gases not only reached record levels last year but were increasing at an ever-faster rate, despite efforts by many countries to reduce emissions.
As world leaders meet next week in South Africa to tackle the issue of climate change, several scientists said their projections show it is unlikely the world can hold warming to the target set by leaders just two years ago in Copenhagen.
"The growth rate is increasing every decade," said Jim Butler, director of the U.S. National Oceanic and Atmospheric Administration's Global Monitoring Division. "That's kind of scary."
Scientists can't say exactly what levels of greenhouse gases are safe, but some fear a continued rise in global temperatures will lead to irreversible melting of some of the world's ice sheets and a several-foot rise in sea levels over the centuries — the so-called tipping point.
The findings from the U.N. World Meteorological Organization are consistent with other grim reports issued recently. Earlier this month, figures from the U.S. Department of Energy showed that global carbon dioxide emissions in 2010 jumped by the highest one-year amount ever.
The WMO found that total carbon dioxide levels in 2010 hit 389 parts per million, up from 280 parts per million in 1750, before the start of the Industrial Revolution. Levels increased 1.5 ppm per year in the 1990s and 2.0 ppm per year in the first decade of this century, and are now rising at a rate of 2.3 ppm per year. The top two other greenhouse gases — methane and nitrous oxide — are also soaring.
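A quick back-of-envelope check puts those figures in perspective. The short sketch below uses only the numbers quoted above; the extrapolation at the end simply assumes the current 2.3 ppm-per-year rate continues unchanged, an illustrative simplification rather than a WMO projection.

```python
# Back-of-envelope check on the WMO figures quoted above.
# The extrapolation at the end is purely illustrative, not a WMO projection.
preindustrial_ppm = 280.0   # ~1750
co2_2010_ppm = 389.0
growth_ppm_per_year = 2.3   # current rate cited by the WMO

increase = (co2_2010_ppm - preindustrial_ppm) / preindustrial_ppm * 100
print(f"CO2 increase since 1750: {increase:.0f}%")   # roughly 39%

# If the 2.3 ppm/yr rate simply continued unchanged (a naive assumption):
for year in (2020, 2030, 2050):
    projected = co2_2010_ppm + growth_ppm_per_year * (year - 2010)
    print(f"{year}: ~{projected:.0f} ppm")
```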
The U.N. agency cited fossil fuel-burning, loss of forests that absorb CO2 and use of fertilizer as the main culprits.
Since 1990 — a year that international climate negotiators have set as a benchmark for emissions — the total heat-trapping force from all the major greenhouse gases has increased by 29 percent, according to NOAA.
The accelerating rise is happening despite the 1997 Kyoto agreement to cut emissions. Europe, Russia and Japan have about reached their targets under the treaty. But China, the U.S. and India are all increasing emissions. The treaty didn't require emission cuts from China and India because they are developing nations. The U.S. pulled out of the treaty in 2001, the Senate having never ratified it.
While scientists can't agree on what level of warming of the climate is considered dangerous, environmental activists have seized upon 350 parts per million as a target for carbon dioxide levels. The world pushed past that mark more than 20 years ago.
Governments have focused more on projected temperature increases rather than carbon levels. Since the mid-1990s, European governments have set a goal of limiting warming to slightly more than 2 degrees Fahrenheit (1.2 degrees Celsius) above current levels by the end of this century. The goal was part of a nonbinding agreement reached in Copenhagen in 2009 that was signed by the U.S. and other countries.
Temperatures have already risen about 1.4 degrees Fahrenheit (0.8 degrees Celsius) since pre-industrial times.
Massachusetts Institute of Technology professors Ron Prinn, Henry Jacoby and John Sterman said MIT's calculations show the world is unlikely to meet that two-degree goal now.
"There's very, very little chance," Prinn said. "One has to be pessimistic about making that absolute threshold." He added: "Maybe we've waited too long to do anything serious if two degrees is the danger level."
Click here to read the rest of the AP story.
LA Times, Dean Kuipers
Several readers pointed out an omission in last week's post about the National Oceanic and Atmospheric Administration’s release of its Annual Greenhouse Gas Index, which showed that man-made gases that contribute to global warming continued a steady rise. The post — and the AGGI — mentioned carbon dioxide, methane, nitrous oxide and other gases, but failed to mention the biggest contributor to global warming: plain old water vapor.
“I want to comment that the way-dominant greenhouse gas in the atmosphere is not mentioned, namely water vapor,” writes Ken Saunders of Pacific Palisades. “Water vapor accounts for about 97 percent of the total (natural plus man-emitted) greenhouse warming of the planet. See, e.g., John Houghton's ‘The Physics of Atmospheres, 3rd edition,’ Cambridge University Press, 2002.”
This is true: water vapor is the major player in the greenhouse effect, and it is often omitted from reports and reporting about global warming — mostly because it is more of a symptom than a cause in global climate change, and cannot be easily mitigated.
Tom Boden, director of the U.S. Energy Department’s Carbon Dioxide Information Analysis Center at Oak Ridge National Laboratory, acknowledges in an email: “Folks are right when they state water vapor is a powerful greenhouse gas and not routinely measured directly in the atmosphere. Atmospheric water vapor is difficult to measure, highly reactive, and variable in amount due to meteorological conditions (i.e., atmospheric water vapor is continuously being generated from evaporation and continuously removed by condensation).”
“Water vapor is the most important greenhouse gas and natural levels of [carbon dioxide, methane and nitrous oxide] are also crucial to creating a habitable planet,” writes John Reilly, professor at MIT and co-director of the Joint Program on the Science and Policy of Global Change, Center for Environmental Policy Research, in an email.
That idea leads many to believe that global warming is natural and cannot be affected much by human activity. Reader Roy W. Rising of Valley Village writes: “Today's report focuses on a bundle of gases that comprise a very small part of total of ‘greenhouse’ gases. It totally disregards the long-known fact that about 95% of all ‘greenhouse’ gases is WATER VAPOR! Spending billions of dollars to alter a few components of the 5% won't affect the natural course of climate change.”
Reilly warns, however, that scientists don’t blame water vapor or clouds for global warming.
“Concerns about global warming are about how human beings are altering the radiative balance,” says Reilly. “While some of the things we do change water vapor directly, they are insignificant. Increasing ghg's [greenhouse gases] through warming will increase water vapor and that is a big positive feedback [meaning: the more greenhouse gases, the more water vapor, the higher the temperature]. But the root cause are ghg's. So in talking about what is changing the climate, changes in water vapor are not a root cause.”
Click here to read the rest of the story reported by the LA Times.
There are many sources that can make a contribution to our energy supply, but likely not at a major scale in the near future.

Beyond wind and solar power, a variety of carbon-free sources of energy — notably biofuels, geothermal energy and advanced nuclear power — are seen as possible ways of meeting rising global demand.
But many of these may be difficult to scale up enough to make a major contribution, at least within the next couple of decades. And a full accounting of costs may show that some of these technologies are not realistic contributors toward reducing emissions — at least, not without new technological breakthroughs.
Biofuels have been an especially controversial and complex subject for analysts. Different studies have come to radically different conclusions, ranging from some suggesting the potential for significant reductions in greenhouse gas emissions to others showing a possible net increase in emissions through increased use of biofuels.
For example, a 2009 study from MIT’s Joint Program on the Science and Policy of Global Change found that a major global push to replace fossil fuels with biofuels, advocated by many as a way to counter greenhouse gas emissions and climate change, could actually have the opposite effect. Without strict regulations, that study found, the push to grow plants for biofuels could lead to the clearing of forestland. But forests effectively absorb carbon from the air, so the net effect of such clearing would be an increase in greenhouse gases entering the atmosphere, instead of a decrease.
Another recent MIT study, by researcher James Hileman of MIT’s Department of Aeronautics and Astronautics, found that replacing fossil fuels with biofuels for aviation could have either positive or negative effects — depending on which crops were used as feedstock, where these were located, and how the fuels were processed and transported.
Key to biofuel’s success is the development of some sort of agriculture that wouldn’t take away land otherwise used to grow food crops. There are at least two broad areas being studied: using microbes, perhaps biologically engineered ones, to break down plant material so biofuels can be produced from agricultural waste; or using microscopic organisms such as algae to convert sunlight directly into molecules that can be made into fuel. Both are active areas of research.
For the former, one problem is that traditional processes to break down cellulose use high temperatures. “You really want these conversions to go on at low temperature, otherwise you lose a lot of energy to heat up” the material, says Ron Prinn, the TEPCO Professor of Atmospheric Science and co-director of the MIT Joint Program on the Science and Policy of Global Change. But, he adds: “Given the ingenuity of bioengineers, these conversion problems will be solved.”
Tapping the Earth
Geothermal energy has huge theoretical potential: The Earth continuously puts out some 44 terawatts (trillions of watts) of heat, which is three times humanity’s current energy use.
The most promising technology for tapping geothermal energy for large-scale energy production is so-called hot dry rock technology (also called engineered geothermal), in which deep rock is fractured, and water is pumped down into a deep well, through the fractured rock, then back up an adjacent well after heating up. This heated water can then be used to generate steam to drive a turbine. A 2006 MIT study led by professor emeritus Jefferson Tester, now at Cornell University, found potential to generate 0.5 terawatts of electricity this way in the United States by 2050. And a new study by researchers at Southern Methodist University, released this week, found that just using presently available technology, there is a potential for 3 terawatts of geothermal electricity in the United States.
In principle, this power source could be tapped anywhere on Earth. As you drill deeper, the temperature rises steadily; by going deep enough it’s possible to reach temperatures sufficient to drive generating turbines. Some places have high temperatures much closer to the surface than others, meaning this energy could be harnessed more easily.
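A rough sense of the depths involved comes from the typical continental geothermal gradient, commonly quoted at about 25 to 30 degrees Celsius per kilometer. That gradient is an assumed textbook value, not a figure from the MIT or SMU studies, but it is enough for an order-of-magnitude sketch of how far down a well must go to reach rock hot enough to drive a turbine.

```python
# Rough depth estimate for hot-dry-rock geothermal, using an assumed
# average continental gradient of ~27.5 C/km (typical textbook range 25-30 C/km).
surface_temp_c = 15.0
gradient_c_per_km = 27.5

def depth_for(target_temp_c):
    """Depth (km) needed to reach the target rock temperature at this gradient."""
    return (target_temp_c - surface_temp_c) / gradient_c_per_km

for target in (150, 200, 250):
    print(f"{target} C rock: drill roughly {depth_for(target):.1f} km")
# Roughly 5 to 8.5 km at these temperatures, which is why cheaper deep
# drilling would do so much to make engineered geothermal cost-effective.
```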
Using this method, “there are thousands of years’ worth of energy available,” says Professor of Physics Washington Taylor. “But you have to drill deeply,” which can be expensive using present-day drilling methods, he says.
“There’s a lot of energy there, but we don’t quite have the technology” to harness it cost-effectively, he says. Less-expensive ways of drilling deep into the Earth could help to make geothermal energy cost effective.
Advanced nuclear
Most analysts agree nuclear power provides substantial long-term potential for low-carbon power. But a broad interdisciplinary study published this year by the MIT Energy Initiative concluded that its near-term potential — that is, in the first half of this century — is limited. For the second half of the century, the study concluded, nuclear power’s role could be significant, as new designs prove themselves both technically and economically.
The biggest factors limiting the growth of nuclear power in the near term are financial and regulatory uncertainties, which result in high interest rates for the upfront capital needed for construction. Concerns also abound about nuclear proliferation and the risks of radioactive materials — some of which could be made into nuclear weapons — falling into the hands of terrorists or rogue governments.
And, while nuclear power is often thought of as zero-emissions, Prinn points out that “it has an energy cost — there’s a huge amount of construction with a huge amount of concrete,” which is a significant source of greenhouse gases.
A bewildering variety of other sources of energy have been discussed. Some, such as fusion power — harnessing the process that powers the sun itself — require significant technological breakthroughs, but could conceivably pay dividends in the very long term.
Others have inherent limits that will, for the foreseeable future, make them much smaller contributors to energy production. For example, the power of waves and tides is a potential energy source, with the world’s oceans producing a total of 3.75 terawatts of tidal power. But, practically speaking, the most that could ever be captured for human use is far less than one terawatt, Taylor says.
With any energy source, it’s crucial to examine, in great detail, the total process required to harness its power. “Every one of these has an energy or environmental cost,” Prinn says. “Nevertheless, this should not deter their consideration. It should instead spur the research needed to minimize these costs.”
How far can wind power go toward reducing global carbon emissions from electricity production?

With the world’s energy needs growing rapidly, can zero-carbon energy options be scaled up enough to make a significant difference? How much of a dent can these alternatives make in the world’s total energy usage over the next half-century? As the MIT Energy Initiative approaches its fifth anniversary next month, this five-part series takes a broad view of the likely scalable energy candidates.
Of all the zero-carbon energy sources available, wind power is the only one that’s truly cost-competitive today: A 2006 report by the U.S. Energy Information Administration put the total cost for wind-produced electricity at an average of $55.80 per megawatt-hour, compared to $53.10 for coal, $52.50 for natural gas and $59.30 for nuclear power.
As a result, wind turbines are being deployed rapidly in many parts of the United States and around the world. And because of wind’s proven record and its immediate and widespread availability, it’s an energy source that’s seen as having the potential to grow very rapidly.
“Wind is probably one of the most significant renewable energy sources, simply because the technology is mature,” says Paul Sclavounos, an MIT professor of mechanical engineering and naval architecture. “There is no technological risk.”
Globally, 2 percent of electricity now comes from wind, and in some places the rate is much higher: Denmark, the present world leader, gets more than 19 percent of its electricity from wind, and is aiming to boost that number to 50 percent. Some experts estimate wind power could account for 10 to 20 percent of world electricity generation over the next few decades.
Taking a longer-term view, a widely cited 2005 study by researchers at Stanford University projected that wind, if fully harnessed worldwide, could theoretically meet the world’s present energy needs five times over. And a 2010 study by the National Renewable Energy Laboratory found that the United States could get more than 12 times its current electricity consumption from wind alone.
But impressive as these figures may sound, wind power still has a long way to go before it becomes a significant factor in reducing carbon emissions. The potential is there — with abundant wind available for harvesting both on land and, especially, over the oceans — but harnessing that power efficiently will require enormous investments in manufacturing and installation.
So far, installed wind power has the capacity to generate only about 0.2 terawatts (trillions of watts) of power worldwide — a number that pales in comparison to an average world demand of 14 terawatts, expected to double by 2050. The World Wind Energy Association now projects global wind-power capacity of 1.9 terawatts by 2020.
But that’s peak capacity, and even in the best locations the wind doesn’t blow all the time. In fact, the world’s wind farms operate at an average capacity factor (the percentage of their maximum power that is actually delivered) somewhere between 20 and 40 percent, depending on their location and the technology.
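Putting those figures together gives a sobering back-of-envelope estimate, using only simple arithmetic on the numbers quoted above rather than anything from the cited studies: a 20 to 40 percent capacity factor applied to roughly 0.2 terawatts of installed capacity yields well under a tenth of a terawatt of average delivered power.

```python
# Back-of-envelope: average power actually delivered by today's wind fleet.
# Installed capacity and demand are the figures quoted in the article;
# the capacity-factor range is the 20-40% spread mentioned above.
installed_capacity_tw = 0.2
world_demand_tw = 14.0

for capacity_factor in (0.20, 0.40):
    delivered_tw = installed_capacity_tw * capacity_factor
    share = delivered_tw / world_demand_tw * 100
    print(f"CF {capacity_factor:.0%}: ~{delivered_tw:.2f} TW delivered "
          f"(~{share:.1f}% of average world demand)")
```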
Some analysts are also concerned that widespread deployment of wind power, with its inherently unpredictable swings in output, could stress power grids, forcing the repeated startup and shutdown of other generators to compensate for wind’s variability. Many of the best wind-harvesting sites are far from the areas that most need the power, necessitating significant investment in delivery infrastructure — but building wind farms closer to population centers is controversial because many people object to their appearance and their sounds.
One potential solution to these problems lies offshore. While many wind installations in Europe have been built within a few miles of shore, in shallow water, there is much greater potential more than 20 miles offshore, where winds blow faster and more reliably. Such sites, while still relatively close to consumers, are generally far enough away to be out of sight.
MIT’s Sclavounos has been working on the design of wind turbines for installation far offshore, using floating platforms based on technology used in offshore oil rigs. Such installations along the Eastern Seaboard of the United States could theoretically provide most of the electricity needed for the eastern half of the country. And a study in California showed that platforms off the coast there could provide more than two-thirds of the state’s electricity.
Such floating platforms will be essential if wind is to become a major contributor to reducing global greenhouse gas emissions, says research engineer Stephen Connors, director of the Analysis Group for Regional Energy Alternatives (AGREA) at the MIT Energy Initiative. Wind energy is “never going to get big if you’re limited to relatively shallow, relatively close [offshore] sites,” he says. “If you’re going to have a large impact, you really need floating structures.”
All of the technology needed to install hundreds of floating wind turbines is well established, both from existing near-shore wind farms and from offshore drilling installations. All that’s needed is to put the pieces together in a way that works economically.
But deciding just how to do so is no trivial matter. Sclavounos and his students have been working to optimize designs, using computer simulations to test different combinations of platforms and mooring systems to see how they stand up to wind and waves — as well as how efficiently they can be assembled, transported and installed. One thing is clear: “It won’t be one design for all sites,” Sclavounos says.
In principle, floating structures should be much more economical than wind farms mounted on the seafloor, as in Europe, which require costly construction and assembly. By contrast, the floating platforms could be fully assembled at an onshore facility, then towed into position and anchored. What’s more, the wind is much steadier far offshore: Whereas a really good land-based site can provide a 35 percent capacity factor, an offshore site can yield 45 percent — greatly improving the cost-effectiveness per unit.
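That capacity-factor difference compounds over a turbine's operating life. In the sketch below, the 5-megawatt rating is an assumed, typical utility-scale figure chosen only for illustration; the 35 and 45 percent capacity factors are the ones cited above.

```python
# Annual energy from one turbine at land vs. far-offshore capacity factors.
# The 5 MW rating is an assumed, typical utility-scale figure for illustration.
rating_mw = 5.0
hours_per_year = 8760

for site, cf in (("good land site", 0.35), ("far offshore", 0.45)):
    mwh = rating_mw * hours_per_year * cf
    print(f"{site}: {mwh:,.0f} MWh/year")
# The offshore site yields about 29% more energy from the same machine,
# which is what "cost-effectiveness per unit" refers to above.
```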
There are also concerns about the effects of adding a large amount of intermittent energy production to the national supply. Ron Prinn, co-director of MIT’s Joint Program on the Science and Policy of Global Change, says, “At large scale, there are issues regarding reliability of renewable but intermittent energy sources like wind that will require adding the costs of backup generation or energy storage.”
Exactly how big is offshore wind power’s potential? Nobody really knows for sure, since there’s insufficient data on the strength and variability of offshore winds. “You need to know where and when it’s windy — hour to hour, day to day, season to season and year to year,” Connors says. While such data has been collected on land, there is much less information for points offshore. “It’s a wholly answerable question, but you can’t do it by just brainstorming.”
And the answers might not be what wind power’s advocates want to hear. Some analysts raise questions about how much difference wind power can make. MIT physicist Robert Jaffe says that wind is “excellent in certain niche locations, but overall it’s too diffuse” — that is, too thinly spread out over the planet — to be the major greenhouse gas-curbing technology. “In the long term, solar is the best option” to be sufficiently scaled up to make a big difference, says Jaffe, the Otto (1939) and Jane Morningstar Professor of Physics.
Connors is confident that wind also has a role to play. “This planet is mostly ocean,” he says, “and it’s pretty windy out there.”