The Map and the Territory

What can energy system modeling tell us about our ability to deeply decarbonize our power grid by replacing fossil fuels with wind, solar, and batteries?

by Christian Roselund
June 2020

At the rollout of any plan to deal with climate change, you can expect to see visual representations of wind turbines and solar panels. These technologies have captured the imagination of the world as symbols of our ability to decarbonize.

And they’re more than symbols; much of the reduction in carbon emissions that has happened across the world has been in the electricity sector, with solar and wind playing a leading role. And yet when we talk about deep decarbonization of the power sector, meaning nearly or entirely eliminating carbon emissions, we don’t have many good examples of how this will work in practice.

All of the nations that have gotten close to 100 percent renewable energy in electricity generation have done so mostly or entirely by using hydropower (Costa Rica, Norway, Uruguay) or geothermal power (Iceland). France achieved a low-carbon power system by deploying nuclear power in the 1970s, but recent nuclear projects have been plagued by ballooning costs and project timelines, and sometimes scandals.

And while some small islands in the Pacific run entirely on solar and batteries, at a national scale Denmark is the only country to get more than 50 percent of its electricity on an annual basis from wind and solar. The Nordic country has only reached that milestone in the last few years, and it depends heavily on electricity trade with its neighbors.

So what should we make of a recent study showing that the United States can reach 90 percent carbon-free electricity by 2035, with electricity costs lower than today’s? Since it has never been fully done before, how do we know we can deeply decarbonize our power systems by replacing fossil fuels with wind, solar, and batteries?

Modeling

To show how an electricity system that has never been built would work, you need sophisticated computer models. The science of electricity system modeling isn’t new; utilities, grid operators, and nations have been using models for decades to predict the future of their systems and to plan resource mixes.

But when planners look at integrating high levels of wind and solar, these models take on new importance and new difficulty. Some of the dimensions of energy modeling are worth mentioning here. The first is demand: electricity demand is not static, changing from hour to hour (more demand during the day, particularly in the evening), day to day (less on weekends), and season to season (more during hot and cold spells). And of course, these patterns differ in important ways from state to state, grid to grid, and region to region.
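To make those patterns concrete, here is a minimal sketch of how a modeler might represent hourly demand with daily, weekly, and seasonal cycles. The numbers are invented for illustration; real models use measured load data for each region.

```python
import numpy as np

HOURS_PER_YEAR = 8760

def toy_demand_profile(base_gw=450.0):
    """Illustrative hourly demand for one year, in GW (made-up numbers)."""
    t = np.arange(HOURS_PER_YEAR)
    hour_of_day = t % 24
    day_of_year = t // 24

    # Daily cycle: demand builds through the day and peaks in the evening.
    daily = 0.15 * np.sin((hour_of_day - 9) / 24 * 2 * np.pi)
    # Weekly cycle: roughly 5 percent less demand on weekends.
    weekend = np.where(day_of_year % 7 >= 5, -0.05, 0.0)
    # Seasonal cycle: two peaks a year, during hot and cold spells.
    seasonal = 0.10 * np.cos((day_of_year - 15) / 365 * 4 * np.pi)

    return base_gw * (1 + daily + weekend + seasonal)

demand = toy_demand_profile()
print(f"min {demand.min():.0f} GW, max {demand.max():.0f} GW")
```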

Output must also be modeled, and this is critical for wind and solar, whose output can change rapidly with the weather. Combine that with the societal expectation of 24/7 electricity supply, and the question becomes how finely, in time and space, these resources need to be modeled.

“With (wind and solar), you need more data to understand how they behave,” states Dr. Christopher Clack, who serves as the CEO of Vibrant Clean Energy (VCE) and has designed energy models for the US National Oceanic and Atmospheric Administration. “With a gas plant you can turn it up and down at will. If you want to do it well, it requires very high levels of granularity, both spatially and temporally. It means that you need many orders of magnitude more data than you would have done previously.”

Add in transmission capacity, and you might have what you need to model whether a mix of resources can reliably meet electric demand.
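At its simplest, the question a model answers for every hour is whether available supply covers demand. Here is a toy version of that adequacy check, with hypothetical generation profiles and no transmission limits or other real-world constraints:

```python
import numpy as np

rng = np.random.default_rng(42)
hours = 8760
t = np.arange(hours)

# Hypothetical hourly series in GW; real studies use weather-driven
# output profiles and measured load for each region and weather year.
demand = 450 + 60 * np.sin((t % 24 - 9) / 24 * 2 * np.pi)
solar = 150 * np.clip(np.sin((t % 24 - 6) / 12 * np.pi), 0, None)
wind = 100 * rng.random(hours)       # crude stand-in for wind weather data
firm = np.full(hours, 400.0)         # hydro, nuclear, gas, geothermal, etc.

# The adequacy question: does supply cover demand in every single hour?
shortfall = np.clip(demand - (solar + wind + firm), 0, None)
print(f"hours short: {(shortfall > 0).sum()} of {hours}, "
      f"worst deficit: {shortfall.max():.0f} GW")
```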

100 Percent Wind, Water, and Solar—and Its Critics

The first report to draw mass excitement about the idea of moving to very high levels of renewable energy was a study led by Stanford professor Mark Z. Jacobson, which hit the pages of Scientific American in 2009. The study concluded that the United States could reach 100 percent wind, water, and solar (WWS) by 2050, and that if we had the will, we could technically get there by 2030.

Under the Solutions Project, Jacobson later applied this approach to the entire world, and state-by-state studies followed. Complete decarbonization would be achieved through massive overbuilding of wind and solar, and Jacobson invoked a “World War II” level of mobilization to construct the necessary generation. The first iterations also relied on massive amounts of thermal storage technologies that are not widely commercialized.

This study generated a lot of controversy, including inside the community of energy modelers. Clack led a team that published a study taking the WWS report to task for not including nuclear power and other technologies, as well as for Jacobson et al.’s modeling approach.

For many years debate has raged, with other studies supporting the assertion that power grids could run on 100 percent renewables, though individual studies inevitably differed in some details from the WWS study.

And as this argument went on, our understanding grew more refined. The US Department of Energy’s National Renewable Energy Laboratory (NREL) began publishing a series of studies, starting in 2010 with the Western Wind and Solar Integration Study (WWSIS), which modeled 30 percent wind and 5 percent solar. This was followed by a study of 30 percent wind in the Eastern Interconnection (the Eastern Wind Integration and Transmission Study).

NREL reached a milestone in 2012 with the Renewable Electricity Futures Study (RE Futures), which modeled scenarios featuring up to 90 percent renewable energy across the entire United States. RE Futures used a new tool: the Regional Energy Deployment System (ReEDS), an electricity system capacity planning model that breaks the United States into 135 local areas.

In doing so, it provided a much higher degree of granularity. “RE Futures was the first to be so time- and space-specific,” recalls Bentham Paulos, a clean energy consultant who has worked on reports modeling very high levels of renewable energy. “It matched real-world wind and solar resources with real load, by time and place. The others just said, we need this many wind turbines, put them somewhere.”
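The heart of a capacity planning model like ReEDS can be boiled down to a toy optimization: choose how much of each technology to build so that demand is met in every modeled hour at minimum cost. The sketch below uses scipy’s linear programming solver with invented costs and availability factors and only four representative hours, where ReEDS works with real resource data across its 135 areas:

```python
import numpy as np
from scipy.optimize import linprog

# Four representative hours, with illustrative demand (GW) and the
# per-unit availability of each technology in those hours.
demand = np.array([60, 80, 100, 70])
avail = np.array([
    [0.0, 0.9, 0.7, 0.0],   # solar: produces only in daytime hours
    [0.4, 0.2, 0.3, 0.6],   # wind: a different, weather-driven pattern
    [1.0, 1.0, 1.0, 1.0],   # gas: fully dispatchable
])
annual_cost = np.array([50.0, 60.0, 90.0])  # made-up cost per GW-year

# Minimize total cost subject to meeting demand in every hour:
#   sum_i avail[i, h] * capacity[i] >= demand[h]  for each hour h
res = linprog(c=annual_cost,
              A_ub=-avail.T,      # flip signs: linprog wants <= constraints
              b_ub=-demand,
              bounds=[(0, None)] * 3)
print("capacity to build (GW):",
      dict(zip(["solar", "wind", "gas"], res.x.round(1))))
```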

A Second Wave

A number of other studies followed this series. Having identified insufficient long-distance transmission as a stumbling block to integrating higher levels of renewable energy, NREL continued its work with the Interconnection Seams Study, published in 2018. This looked at ways to cost-effectively upgrade the US transmission system, including connecting the nation’s three main grids: the Eastern Interconnection, the Western Interconnection, and the Electric Reliability Council of Texas (ERCOT).

Also in 2018, a team including climate scientist Ken Caldeira published a paper finding that the United States could meet up to 80 percent of electricity demand with wind and solar paired with either 12 hours of energy storage or a massive high-voltage DC grid; beyond that point, the amounts of these resources needed start to rise sharply.

Geophysical Constraints on the Reliability of Solar and Wind Power in the United States was presented as a way to examine the “fundamental constraints” that geophysics imposes on wind and solar. Instead, it is now often cited as evidence that the United States can get to very high levels of these resources.

A third important report in this series was published a year earlier. Flexibility: The Path to Low-Cost, Low-Carbon Grids, by Climate Policy Initiative, showed how various global geographies could reach very high levels of zero-carbon generation at electricity prices similar to today’s by maximizing flexibility across the system and using a mix of flexible gas plants and lithium-ion batteries.

The Cost Question

As the models multiplied, it became increasingly obvious that it was technically possible to reach very high levels of renewable energy while maintaining grid reliability. And more models began to look not only at how these high levels could be reached, but at whether they could be reached cost-effectively.

This has been a moving target. In 2009–2012, when Jacobson and NREL were releasing the first reports on very high levels of renewable energy, the cost of wind and particularly solar was still high. Furthermore, the cost of batteries for grid storage was so high, and their deployment so limited, that it was hard to take them seriously as a main part of the solution.

Additionally, visibility into future costs was limited. It was clear that costs were coming down, but recent cost data was hard to come by. Part of this had to do with the way that costs had traditionally been modeled, using data that was collected slowly over a period of years. For conventional generation like gas or nuclear power plants, this was not terribly problematic, and at the time models relied on estimates that were usually several years old.

However, this lag becomes a real problem amid the rapid fall in wind and solar costs. Even attempts to forecast costs from historical data tended to fall short, as wind and solar costs have fallen faster than even renewable energy proponents and the industries themselves anticipated.

Mark Dyson, a principal in RMI’s Electricity Program, notes that utility solicitations have been important in getting more recent price data. In particular he points to a requirement in Colorado that utilities issue an all-source procurement before issuing a long-term plan. “Xcel’s gotten good at it, and then released their data showing how cheap wind and solar have gotten in the West,” notes Dyson.

Consulting firms, including Lazard and Bloomberg New Energy Finance (BNEF), have also released more recent data on wind and solar costs. However, Dyson argues that these figures often suffer from a lack of granularity, which matters because the output of wind and solar, and therefore the cost per unit of energy delivered, varies significantly depending on where the resources are deployed.
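The granularity point comes down to simple arithmetic: a levelized cost divides mostly fixed costs by the energy a plant actually produces, so the same solar plant is far cheaper per megawatt-hour in a sunny region than a cloudy one. A back-of-envelope sketch, with illustrative numbers rather than Lazard’s or NREL’s figures:

```python
def simple_lcoe(capex_per_kw, fixed_om_per_kw_yr, capacity_factor,
                lifetime_yr=25, discount_rate=0.06):
    """Very simplified levelized cost of energy in $/MWh.

    Ignores taxes, degradation, and financing structure; the point is
    only how strongly capacity factor drives the result.
    """
    # Capital recovery factor annualizes the upfront cost.
    crf = discount_rate / (1 - (1 + discount_rate) ** -lifetime_yr)
    annual_cost = capex_per_kw * crf + fixed_om_per_kw_yr  # $/kW-year
    annual_mwh_per_kw = 8760 * capacity_factor / 1000      # MWh/kW-year
    return annual_cost / annual_mwh_per_kw

# The same illustrative plant, in regions with different resource quality:
for cf in (0.15, 0.20, 0.28):
    print(f"capacity factor {cf:.0%}: ${simple_lcoe(1000, 15, cf):.0f}/MWh")
```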

According to Dyson, NREL’s Annual Technology Baseline (ATB), first published in 2015, was a significant development. The ATB provides annual data on wind and solar costs and performance, broken down by US region, and has been widely used in recent reports.

2030, 2035, and 2050

Along with cost, there is another issue that has come up: speed. While many studies look at near or full decarbonization of the power system by 2050, the Intergovernmental Panel on Climate Change (IPCC) has stated that we need to reduce global emissions 50 percent economy-wide by 2030 in order to stay on a 1.5°C path.

In the US electricity sector, we may need much more than a 50 percent reduction. Because electricity generation is, in many places, the only sector that has made much progress, it will likely bear more of the burden. Additionally, more of the pressure will fall on developed nations, where electric demand has stopped growing. Put together, this means that nations like the United States will need to dramatically reduce power sector emissions in the next 10–15 years to stay in line with 1.5°C or even 2°C targets, and that changes made 30 years from now may come too late.

Fortunately, there have been studies that reflect the urgency of this timeframe. In 2016, a team including Dr. Clack released a study finding that the United States could reduce electricity sector emissions by up to 78 percent by 2030, mostly by deploying wind and solar.

Future Cost-Competitive Electricity Systems and Their Impact on US CO2 Emissions laid out one of the most aggressive decarbonization paths yet considered, and it showed that this could be accomplished cost-effectively, with electricity rates roughly the same as when the study was published. To achieve these feats, it relied heavily on a build-out of long-distance transmission, but it did not model battery storage.

The latest on the theme of rapid, deep decarbonization of the US electricity system is a report that the Center for Environmental Public Policy at UC Berkeley and GridLab released this June. The 2035 Report: Plummeting Solar, Wind, and Battery Costs Can Accelerate Our Clean Energy Future shows that the United States can reach 90 percent zero-carbon generation—including wind, solar, nuclear, and hydropower—by 2035 at a cost lower than we pay for power today.

It can’t be stressed enough how much this is an evolution of earlier work; a study like this literally wasn’t possible before, because the inputs didn’t exist. GridLab and UC Berkeley used not only NREL’s ReEDS, but also its ATB to estimate costs. And one thing that the more recent cost estimates enable is a more significant role for lithium-ion batteries, a technology that was considered too expensive to play a massive role in previous studies.

Also, unlike previous studies, it does not rely on an extensive build-out of long-distance transmission, instead focusing on locally built renewables and noting that even where resource quality is lower, falling costs mean these projects are still cost-effective.

To be clear, there is still $100 billion in new transmission assumed in the study, but this is mostly “spur” lines to connect power plants to interstate transmission. “This is a stark contrast with the Renewable Futures Study in 2012,” states David Wooley, the director of the Center for Environmental Public Policy at UC Berkeley. “The low cost of wind and solar has expanded what is possible, dramatically.”

To get this result, The 2035 Report paired ReEDS with the PLEXOS production cost model and used seven years of hourly data. “ReEDS and PLEXOS is a very powerful combination because it is very time and space specific,” states Ben Paulos, who served as a co-author of the report.

However, Christopher Clack of VCE has noted that there is still room for improvement, arguing that the detail gained from five-minute rather than hourly modeling would affect the results. “The spatial and temporal coarseness leads to missed opportunities and impacts results (especially geographic/temporal diversity) observed when the result went from 90 percent down to 85 percent from ReEDS to PLEXOS,” notes Clack.

While the study’s authors maintain their confidence that a 90 percent outcome can be achieved with PLEXOS under different constraints, they don’t dispute that more granularity is possible. “We would love to do some follow-up; it is really a macro-level look,” stated Ben Paulos. However, there is the question of how much value more detail would add.

“Sub-hourly dispatch modeling will be more accurate than hourly, but the additional value is small, especially when done over a large geographic area,” argues Nikit Abhyankar, a scientist at UC Berkeley and one of the modelers of The 2035 Report. “It could be important when looking at smaller regions, like transmission networks at the individual line level.”
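The tradeoff the two modelers describe is easy to see with synthetic data: averaging five-minute output into hourly blocks smooths away short ramps that grid operators must actually follow. A small sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

# One synthetic day of 5-minute solar output (GW), with passing-cloud noise.
steps = 24 * 12
t = np.arange(steps) / 12.0                       # hour of day
clear_sky = 100 * np.clip(np.sin((t - 6) / 12 * np.pi), 0, None)
five_min = clear_sky * (1 - 0.3 * rng.random(steps))

# An hourly model sees only the average of each 12-step block.
hourly = five_min.reshape(24, 12).mean(axis=1)

# Compare the steepest ramp rate each resolution reveals, in GW per hour.
ramp_5min = np.abs(np.diff(five_min)).max() * 12
ramp_hourly = np.abs(np.diff(hourly)).max()
print(f"steepest ramp at 5-minute resolution: {ramp_5min:.0f} GW/h")
print(f"steepest ramp at hourly resolution:   {ramp_hourly:.0f} GW/h")
```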

The End or the Beginning?

Now that a sophisticated computer model has calculated that we can reach 85–90 percent zero-carbon electricity by 2035, at costs lower than today’s electricity prices, that must be the end of the story, right?

As advanced as The 2035 Report is, scientists say that it isn’t the end of our understanding of reaching very high levels of renewable energy, and there are still unanswered questions. A chief one is that because GridLab and UC Berkeley were focused on 2035, they didn’t look as closely at the electrification of heating and transportation.

Ironically, Jacobson’s 2009 WWS study is significant in that it modeled not only the electricity sector, but also the electrification of transportation and heating. According to Clack, this can significantly alter the outcome of models.

“In a broad brush, electrification completely changes everything in terms of what you would do in terms of building the grid out,” explains Clack. Chiefly, he notes, “you need more transmission with electrification,” but also that the timing of demand, and thus the resource mix, will shift. With electrification of heating, the US grid could move from summer peaking to winter peaking, which means more wind, and more need for on-demand generation or creative solutions during low-wind periods in winter.
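The seasonal shift Clack describes can be illustrated with a toy calculation: layer an electrified heating load, which peaks in the cold months, onto a summer-peaking base load, and the annual peak moves to winter. The numbers below are invented purely to show the mechanism:

```python
import numpy as np

days = np.arange(365)

# Illustrative daily average loads in GW (invented, not from any study).
base = 450 + 50 * np.cos((days - 200) / 365 * 2 * np.pi)          # summer peak
heating = 120 * np.clip(np.cos(days / 365 * 2 * np.pi), 0, None)  # cold months

total = base + heating
print(f"peak without electrified heating: day {base.argmax()}")   # midsummer
print(f"peak with electrified heating:    day {total.argmax()}")  # winter
```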

And ultimately, there is the question of how we translate these models into the real world. “The ways in which the studies represent this in computer code are necessarily different than the decisions of individuals in control centers who have done them,” notes RMI’s Mark Dyson. “Both are probably getting better.”

Dyson recalls when local utility staff operating the grid in Colorado were spooked by the prospect of even 20 percent wind and solar. The same staff, he notes, are now modeling 55 percent wind and solar by the mid-2020s, and even more after that.

At this point, for deep decarbonization it may matter less what models we make, and more how quickly we deploy the solutions we have. “In the near term, there are absolutely no technical or cost reasons not to sprint on deployment of wind and solar,” argues Dyson.