Aging US Power Grid Blacks Out More Than Any Other Developed Nation
The United States endures more blackouts than any other developed nation, and the number of U.S. power outages lasting more than an hour has increased steadily for the past decade, according to federal databases at the Department of Energy (DOE) and the North American Electric Reliability Corp. (NERC).
According to federal data, the U.S. electric grid loses power 285 percent more often than it did in 1984, when data collection on blackouts began. Those outages cost American businesses as much as $150 billion per year, the DOE reported, with weather-related disruptions costing the most per event.
“Each one of these [blackouts] costs tens to hundreds of millions, up to billions, of dollars in economic losses per event,” said Massoud Amin, director of the Technological Leadership Institute at the University of Minnesota, who has analyzed U.S. power grid data since it became available in the '80s.
“The root causes” of the increasing number of blackouts are aging infrastructure and a lack of investment and clear policy to modernize the grid, Amin said. The situation is worsened by gaps between federal and local regulators' policies, and the grid now faces new risks from terrorism and the extreme impacts of climate change.
Demand for electricity has also grown 10 percent over the last decade, even though there are more energy-efficient products and buildings than ever. As Americans rely increasingly on digital devices, summers get hotter (particularly in the southern U.S.) and seasonal demand for air conditioning grows, the problem is only getting worse.
While customers in Japan lose power for an average of 4 minutes per year, customers in the American upper Midwest lose power for an average of 92 minutes per year, and customers in the upper Northwest lose power for an average of 214 minutes per year, according to Amin’s analysis. Those estimates exclude extreme events such as severe storms and fires, though those, too, have been increasing over the past two decades.
“We used to have two to five major weather events per year [that knocked out power], from the ‘50s to the ‘80s,” Amin said. “Between 2008 and 2012, major outages caused by weather increased to 70 to 130 outages per year. Weather used to account for about 17 to 21 percent of all root causes. Now, in the last five years, it’s accounting for 68 to 73 percent of all major outages.”
The power grid, which could be considered the largest machine on Earth, was built after World War II from designs dating back to Thomas Edison, using technology that largely dates to the '60s and '70s. Its 7,000 power plants are connected by more than 5 million miles of power lines, all managed by 3,300 utilities serving 150 million customers, according to industry group Edison Electric Institute. The whole system is valued at $876 billion.
The utility industry has talked for years about updating its infrastructure into a “smart grid,” a makeover that could cost between $338 billion and $476 billion, according to the Electric Power Research Institute. A smart grid would allow utilities to monitor customers’ electricity use remotely, from a central location, rather than requiring on-site readings of meters at homes and businesses.
In 2009, the American Recovery and Reinvestment Act provided $4.5 billion for electricity delivery and energy reliability modernization efforts, matched by private funding to reach a total of $8 billion, to begin the large task of modernizing America’s aging energy infrastructure and providing more reliable power.
Amin estimates that a smarter grid could reduce the cost of outages by about $49 billion per year and cut carbon emissions by 12 to 18 percent by 2030.
As the electric sector continues deploying smart grid technologies, resiliency and reliability will continue to improve, a U.S. Department of Energy official told IBTimes.