Utility companies provide electricity without interruption to houses and businesses in the territory they serve. To produce electricity, they tap into a mix of power sources, from fossil fuels to nuclear, with a growing share coming from renewables — mainly solar and wind energy.
“During the day, solar often provides free energy, so there’s less overall need for production from generators,” Brown said, “while this reverses in the evening.”
In addition, unexpected cloud cover or sudden temperature changes may force system operators to quickly reshuffle production, he said, which poses an operational challenge.
“When utilities ramp up production from traditional sources—such as generators fueled by coal or natural gas—they need to consider that each of them has different start-up and operational efficiencies,” Brown said. “Finding an optimal mix is a very complex problem, which also requires looking ahead into how conditions can change.”
Brown said some generators are slow-start—it takes a few hours to bring them to full output, but then they run at relatively low cost. He said fast-start generators—called “peakers”—activate quickly, but cost more to run.
Brown added that nuclear plants often need to stay on, and noted that many utilities also use hydro storage, pumping water to a higher altitude and then letting it fall to generate electricity.
“There might be hundreds or thousands of assets they need to coordinate,” Brown said.
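The trade-off between slow-start and fast-start units can be sketched with a toy calculation. This is not the authors' model; the costs, start-up times, and demand figures below are invented purely for illustration.

```python
# Toy illustration (all numbers hypothetical): comparing a slow-start
# generator with a fast-start "peaker" for serving a block of demand.

SLOW = {"startup_cost": 5000.0, "cost_per_mwh": 30.0, "startup_hours": 4}
PEAKER = {"startup_cost": 500.0, "cost_per_mwh": 90.0, "startup_hours": 0}

def cost_to_serve(unit, mwh_needed, hours_of_notice):
    """Total cost for the unit to serve a block of energy,
    or None if it cannot be started in time."""
    if hours_of_notice < unit["startup_hours"]:
        return None  # unit cannot ramp up fast enough
    return unit["startup_cost"] + unit["cost_per_mwh"] * mwh_needed

# With plenty of notice, the slow-start unit is cheaper for a large block...
print(cost_to_serve(SLOW, 500, hours_of_notice=6))    # 20000.0
print(cost_to_serve(PEAKER, 500, hours_of_notice=6))  # 45500.0

# ...but for a sudden, short-notice shortfall only the peaker is available.
print(cost_to_serve(SLOW, 50, hours_of_notice=1))     # None
print(cost_to_serve(PEAKER, 50, hours_of_notice=1))   # 5000.0
```

The planner's difficulty is that which option is cheaper depends on how much notice the system has—which is exactly what is uncertain.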
Brown and his coauthor Professor James Smith of the Tuck School of Business at Dartmouth College developed an approach for dealing with this challenge in a paper titled “Unit Commitment without Commitment: A Dynamic Framework for Managing an Integrated Energy System Under Uncertainty.” The paper emerged as part of their work on the project “A Grid that is Risk Aware for Clean Electricity (GRACE),” led by Dalia Patino-Echeverri, a professor at Duke’s Nicholas School of the Environment.
A “unit commitment” problem
Brown said their goal was to improve the current industry practice of committing to an energy mix ahead of time—based on single forecasts of demand and renewable production.
He said utilities typically commit the slow-starting and hydro-storage units in their daily plan, which relies on these point forecasts. Brown said if the utility must make up for a shortfall in supply from renewables or faces unexpectedly high demand due to changes in weather conditions, it ramps up the slow-start units it has committed for the day or uses fast-starting “peakers” to fill in the gaps.
“But this can be inefficient,” Brown said. “Sometimes it might be better to turn on a slow-start generator that they had previously committed to keep off, and vice versa.”
A key aspect of their approach is that it explicitly incorporates uncertainty.
“You want to incorporate uncertainty upfront, but it needs to be computationally manageable,” Brown said. “And ideally the plan allows for flexibility, especially in systems with more solar or wind capacity. Current industry practice does not include uncertainty at all—not because they don’t recognize the uncertainty or aren’t sophisticated, but because doing so is incredibly challenging.”
The approach by Brown and Smith assigns each energy source a value that changes depending on the variable conditions. For example, the value of a power plant varies with its operational status, current weather conditions, and the demand for electricity at any point in time, Brown said.
“At the start of each day, we solve a dynamic optimization problem for each unit, factoring in many possible scenarios,” he said. “Decomposing the problem across units like this greatly simplifies the problem, but this also requires us to design prices to capture the cost of production for the ‘last’ unit of power in the system at each point in time.”
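The per-unit decomposition Brown describes can be sketched as each generator solving its own small dynamic program against a sequence of system prices. The following is a minimal single-scenario sketch, not the authors' implementation: the prices, capacity, and costs are hypothetical, and the real method optimizes over many scenarios at once.

```python
# Minimal sketch of the per-unit idea: given hourly system prices, a single
# generator solves a small dynamic program over (hour, on/off) states to
# decide when to run. All numbers are hypothetical.

def best_schedule(prices, capacity, marginal_cost, startup_cost):
    """Return the profit-maximizing on/off schedule for one unit."""
    NEG = float("-inf")
    value = {0: 0.0, 1: NEG}  # best profit so far, by state (0=off, 1=on)
    backptrs = []             # back-pointers for recovering the schedule
    for p in prices:
        hourly = (p - marginal_cost) * capacity  # profit if running this hour
        step, back = {}, {}
        # end this hour off: stay off, or shut down
        step[0], back[0] = max((value[0], 0), (value[1], 1))
        # end this hour on: stay on, or pay the startup cost to switch on
        step[1], back[1] = max((value[1] + hourly, 1),
                               (value[0] + hourly - startup_cost, 0))
        backptrs.append(back)
        value = step
    # trace back the optimal on/off decisions
    state = 0 if value[0] >= value[1] else 1
    schedule = []
    for back in reversed(backptrs):
        schedule.append(state)
        state = back[state]
    return list(reversed(schedule)), max(value.values())

prices = [20, 25, 60, 80, 70, 30]  # $/MWh over six hours (hypothetical)
sched, profit = best_schedule(prices, capacity=100,
                              marginal_cost=35, startup_cost=1500)
print(sched, profit)  # the unit runs only through the midday price spike
```

Because each unit's problem is tiny, thousands of such problems can be solved quickly; the hard part, as Brown notes, is designing the prices so the units' independent decisions add up to a good system-wide plan.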
The goal is to minimize costs not just for the next hour but over time, thinking forward about the implications of each decision, Brown said.
“For example, ideally the utility would avoid using up all the storage to preserve some for later use,” Brown said. “The key is to be flexible. We're not committing to anything upfront but allowing our valuations to guide production as conditions change.”
Testing the method
Brown and Smith tested their framework using recent data from Duke Energy, an industry partner on the GRACE project.
The results, Brown said, show that the approach is more efficient both in low-demand scenarios—for example, when there is unexpectedly high solar production—and in high-demand scenarios—like when it is hotter than expected and everyone turns up their air conditioners. In fact, in their experiments, their approach outperformed current practice in every scenario.
“We are not just reducing costs on average,” Brown said. “We are reducing them across the board, regardless of how uncertainties unfold. Also, we found that our approach performed almost as well as a clairvoyant who has perfect foresight about future demands and solar production. That’s the best you could hope for. These findings really surprised us.”
Brown said the experiments showed greater improvements over current practice as the amount of solar and storage in the system increased.
“Much more solar and storage is exactly where companies like Duke Energy are headed, as they work towards mandated net zero targets over the next 10 to 20 years,” he said.
Brown believes their results show tremendous potential for more efficient operations at utilities.
“Using just a desktop PC, we can solve for our daily plans in a few seconds, and can produce flexible decisions in a fraction of a second, even for a system as large as Duke Energy,” Brown said. “And they don’t need to overhaul their operations. They just need to use a different algorithm to generate their daily plans.”
“This is low-hanging fruit, with the potential not just to lower costs but also to reduce utilities’ greenhouse gas emissions by making operations more efficient,” he said.