Giving computers power over grid



Researchers say they're a few years away from a self-healing electrical grid.
CHICAGO TRIBUNE
Rather than stringing more electrical lines, researchers at Argonne National Laboratory say a better way to prevent another crippling power outage involves software that predicts problems before they happen.
In a radical approach, they are treating the nation's electrical grid as if it were the Internet, where electrical demand is anticipated and the flow adjusted before imbalances cascade out of control.
"We have 157,000 miles of transmission lines in the country, and it costs half a million dollars to build a mile of transmission line," said Lefteri Tsoukalas, an engineering professor at Purdue University in West Lafayette, Ind., which leads a team of engineers that developed the software Argonne is testing. "Half a million dollars buys a lot of information technology."
A "smart" electrical grid would use extensive information about where electrical demand originated to predict future demand. These predictions would warn the system when trouble may occur -- like a massive blackout -- so it could be averted.
"What we've learned is, this is doable," said Yung Liu, manager of energy technology at Argonne, in Argonne, Ill. "You get pretty good predictions."
Furthermore, as computer chip prices continue to fall, adding intelligence to the power grid is much cheaper than rebuilding it, Tsoukalas said.
How it works
The software Argonne is testing, called TELOS, looks at patterns of past electrical use to learn how to predict future demand. Within a few years, the technology now being lab-tested could be ready for the real world, according to the engineers.
Argonne has collected 30 months' worth of data on its past electrical consumption, correlated with hour-by-hour weather information.
A year's worth of data was used to train TELOS to understand Argonne's electrical consumption patterns. Now it is running simulations that predict power consumption, and those predictions are compared with the power demands that actually occurred.
Although the predictions aren't perfect, they have been fairly close to actual use, Liu said.
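The article does not describe TELOS's internals, but the workflow it outlines -- learn a consumption profile from historical load and weather data, predict future demand, then compare predictions with what actually happened -- can be illustrated with a minimal sketch. The data and the hour-and-temperature lookup below are invented for the example, not drawn from the Argonne tests.

```python
# A minimal sketch of the pattern-learning idea described above -- not TELOS
# itself.  Historical hourly load, paired with weather, builds a simple
# profile; future demand is predicted from the profile and compared with
# "actual" demand.  All numbers here are synthetic.

from collections import defaultdict
from statistics import mean

def train_profile(history):
    """history: list of (hour_of_day, temperature_f, load_mw) records.
    Returns average load keyed by (hour, 10-degree temperature bucket)."""
    buckets = defaultdict(list)
    for hour, temp, load in history:
        buckets[(hour, temp // 10)].append(load)
    return {key: mean(loads) for key, loads in buckets.items()}

def predict(profile, hour, temp, fallback=0.0):
    """Look up the expected load for a given hour and forecast temperature."""
    return profile.get((hour, temp // 10), fallback)

if __name__ == "__main__":
    # Synthetic "past consumption": hotter afternoon hours draw more power.
    history = [(h, 60 + 2 * h, 40 + 3 * h) for h in range(24)] * 30

    profile = train_profile(history)

    # Compare predictions against a few hours of "actual" demand, as the lab
    # does with its simulations.
    for hour, temp, actual in [(9, 78, 68.0), (15, 90, 86.5), (21, 102, 101.0)]:
        predicted = predict(profile, hour, temp)
        print(f"hour {hour:2d}: predicted {predicted:5.1f} MW, actual {actual:5.1f} MW")
```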
Using computers in the decision-making process is probably necessary to prevent massive power outages from spreading, he added.
"In a typical control room, you try to prevent cascading problems," Liu said. "But in some cases, you have only seconds to act -- the blink of an eye. I don't think humans can do it."
Local level
Large national power grids have become so complex that no one can monitor or control them adequately, Tsoukalas said. So instead of focusing on the overall grid, local grids would be monitored and demand would be predicted on a smaller scale.
By correcting local problems before they spread, he said, operators can avoid major failures of the overall grid.
"If you see anomalies -- feeders are overloaded, congestion -- they can be corrected," Tsoukalas said. "But often, they go unnoticed until it's too late."
Still, information technology alone won't solve all grid problems, he said. There must be adequate generation and transmission infrastructure in place, but adding computer intelligence to the system would vastly improve reliability and security.
"You could give every electric meter in the country an IP [Internet protocol] address," Tsoukalas said.
Although engineers have long used data to manage the electrical grid, turning the task over to computers would be a fairly radical move, said David Schooley, a senior engineer with Commonwealth Edison in Chicago who worked with the Purdue-led consortium.
"I think it's a fundamentally sound approach," he said. "It's in the lab at this point and has to be developed more to be put into practice. This would be another step in software potentially making decisions, looking out ahead and doing something to alleviate a problem.
"Nothing we have now does that on a large scale."