
Maintaining Facility Power in the Age of the Blackout

June 1, 2004

Three years ago California endured a summer of rolling blackouts that cost businesses millions of dollars in lost production and raised concerns about the rest of the nation's power grid. Despite warnings about the poor condition of the power grid from those in the highest offices in the nation, little was said about how or when the problem would be remedied.

In a bid to urge new standards for the reliability of the electric power system, former Energy Secretary Bill Richardson said, “In my view, we're the world's greatest superpower, but we have a Third World electricity grid. We have antiquated transmission lines. We have an overloaded system that has not had any new investments. And we don't have mandatory reliability standards on utilities.”

Even the current Secretary of Energy, Spencer Abraham, has recognized that it likely will be a bumpy ride, stating that “Our nation's transmission system over the next decade will fall short of the reliability standards our economy requires and will result in additional bottlenecks and higher costs to consumers. It is essential that we begin immediately to implement the improvements that are needed to ensure continued growth and prosperity.”

Recent analysis of the situation has also shown that a human element is involved. Skyrocketing electricity prices, which tripled in some cases, were caused in part by market manipulation by energy traders who intentionally orchestrated the congestion of transmission lines and inflated wholesale energy costs. The final report issued by the Department of Energy on the August 14 blackout also found issues “strikingly similar” to those in previous blackouts; in each case, a major contributor was poor communication between different entities on the power grid.

Recent outages that have occurred in North America and around the globe (see Sidebar, “Going Global: Recent Blackouts”) have imparted a new sense of urgency about the reliability of electrical power.

Reasons for concern. Even though the nation's power grid is said to be 99.9% reliable, the typical U.S. business will endure about 14 power quality “events” per year. “Event” is defined as any deviation in power quality that may affect the performance of a customer's load — not just outages. Not only that, the average residential customer will experience 90 minutes of outages per year, most of which will be distribution-related.
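For context on how such figures translate (a back-of-the-envelope conversion, not a number taken from the article's sources), an availability percentage maps to annual downtime as follows:

```python
# Convert between availability and expected annual downtime (illustrative).
HOURS_PER_YEAR = 8760

def annual_downtime_hours(availability):
    """Expected hours of downtime per year for a given availability (0 to 1)."""
    return (1 - availability) * HOURS_PER_YEAR

def availability_from_downtime(downtime_hours):
    """Availability implied by a given number of downtime hours per year."""
    return 1 - downtime_hours / HOURS_PER_YEAR

# "99.9% reliable" corresponds to nearly nine hours of downtime a year...
print(annual_downtime_hours(0.999))         # ~8.76 hours
# ...while 90 minutes of outages per year works out to ~99.98% availability.
print(availability_from_downtime(1.5))      # ~0.99983
```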

The power grid isn't the only problem. Demand for electricity continues to grow at roughly 2% to 3% annually. That's a scary trend for a saturated grid that already delivers about 3.6 trillion kWh per year. Certain regions, such as Manhattan, are experiencing staggering increases in power demand, up by as much as 25% over just a few years ago.
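To put that growth rate in perspective (a simple illustrative projection using only the figures above), 2% to 3% annual growth on a 3.6-trillion-kWh base compounds quickly:

```python
# Illustrative projection of demand growth on the 3.6 trillion kWh base above.
base_kwh = 3.6e12
for rate in (0.02, 0.03):
    added_next_year = base_kwh * rate              # 72 to 108 billion kWh of new demand
    after_ten_years = base_kwh * (1 + rate) ** 10  # roughly 4.4 to 4.8 trillion kWh
    print(f"{rate:.0%}: +{added_next_year:.2e} kWh next year, "
          f"{after_ten_years:.2e} kWh after a decade")
```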

This jump is largely attributed to the growing installation of digital/IT equipment and the power needed to keep it cool, which can often rival the power drawn by the equipment itself, resulting in a double hit to the grid. Since 1980, as much as 90% of the growth in energy demand can be attributed to supplying power to digital (power-sensitive) loads, and an estimated 9% of the nation's electricity now goes to powering digital devices.

Environmental regulations, coupled with other obstacles that stand in the way of expanding the nation's generating capacity, make powering the new digital economy a challenge.

While utilities define an outage as a power interruption of five minutes or more (Fig. 1), most electronic devices and IT equipment are designed to ride through interruptions of only about eight milliseconds or less (Fig. 2). Therefore, any business that intends to run without interruption must have some sort of power conditioning/backup power source in place. Today, it's estimated that 3% to 5% of the grid is fortified with an uninterruptible power supply (UPS), along with 80 gigawatts (GW) of off-grid generating capacity (roughly 10% of the grid's capacity) installed for longer-duration backup. Yet much of the infrastructure aimed at guarding against interruptions targets short-duration outages, not longer outages like those experienced in recent blackouts.
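As a quick sanity check on those figures (an illustrative calculation whose only inputs are the numbers quoted above), the backup-capacity and ride-through numbers can be related as follows:

```python
# Sanity check on the backup-power and ride-through figures cited above
# (illustrative only; inputs are the numbers quoted in the text).
offgrid_backup_gw = 80              # off-grid backup generating capacity
backup_share_of_grid = 0.10         # "roughly 10% of the grid's capacity"
implied_grid_capacity_gw = offgrid_backup_gw / backup_share_of_grid
print(implied_grid_capacity_gw)     # ~800 GW of total grid capacity

# The ride-through gap: utilities don't count an interruption as an outage
# until it lasts five minutes, yet IT gear tolerates only ~8 milliseconds.
utility_outage_threshold_ms = 5 * 60 * 1000
it_ride_through_ms = 8
print(utility_outage_threshold_ms / it_ride_through_ms)  # a 37,500x gap
```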

The future of backup power. New technologies are on the horizon that promise everything from independence from the grid to smart ways to ride through long outages. Some hold real promise, while others seem to recede toward an ever-distant horizon.

One of the most practical solutions isn't a breakthrough technology at all, but rather a smart practice called distributed generation. It consists of locating many small generating plants very close to demand areas. Typically, these generators are fired up to satisfy peak demand periods when the grid is most likely to be compromised.

Industry experts support this concept. In a recent Cap Gemini survey, 75% of the respondents said that distributed generation would have a positive effect in the next three to five years, “mitigating congestion costs and addressing system upgrade requirements for transmission.”

Natural gas turbines, particularly high-efficiency co-generation systems, are expected to fill much of the void. With a relatively low profile and low emissions, co-generation plants don't require the long lead times or regulatory red tape associated with larger generating stations. That's not to say, however, that a few nuclear power stations aren't on the horizon.

In many cases, energy consumers elect to use co-generation systems as a cost-effective and reliable source of prime power backed up by the utility. California currently leads the nation with an installed base of 6,500MW of co-generation power plants (12% of the state's total generation), which is soon expected to double.

Microturbines also hold promise in distributed generation applications. These smaller cousins of the full-size turbine generator can produce as much as a few megawatts in parallel systems and can be located very close to the actual load or on-site at the customer's facility.

Today's microturbines are quiet, small (a 30kW turbine is roughly the size of a refrigerator), operate on most fuels, are practically pollution-free, and require only a fraction of the maintenance of a large diesel engine. One day, microturbines may be commonplace in neighborhoods for peak shaving, or in some cases prime power for commercial customers who may not want to contend with outages associated with a utility network.

Fuel cell technology may also hold promise. It's already commercially available, and some utilities are looking at equipping residences with their own fuel cell generator that operates on a natural gas source. The advent of vehicle fuel cells is expected to drive down costs for the technology in general and improve the widespread commercial viability of fuel cells as a distributed generation source.

High-power fuel cell plants have been in operation for years, but until a hydrogen infrastructure is established and the cost falls to a reasonable level, they may not be the solution to the grid problems in the near future.

The real improvements will likely not come from a single solution, but rather from a combination of many: increased generating capacity, massive reinvestment in the aging distribution infrastructure, better demand monitoring and contingency planning, and a strong base of distributed generation systems.

The rolling blackouts of three summers ago may have brought attention to the country's failing transmission and distribution system, but last summer's blackout on the East Coast brought the problem into much sharper focus. Facilities managers concerned about the reliability of their power can't do much to improve quality on the utility side of the meter, but they can take proactive steps to protect their systems with technologies like distributed generation and fuel cells until the day the grid is more reliable.

Katz is senior product manager at MGE UPS Systems in Costa Mesa, Calif.

Sidebar: Going Global: Recent Blackouts

Sept. 23, 2002: A massive power failure disrupts central Chile. The official reason is said to be “faulty programming” and a “technical failure” at a power station.

Nov. 24, 2002: Buenos Aires and La Plata, Argentina, are hit by a huge power failure.

Jan. 31, 2003: An unusual power failure hits Cambridge, Ontario, Canada.

April 29, 2003: A power failure hits the airport in Melbourne, Australia, disrupting operations for 90 minutes.

Aug. 6, 2003: Buenos Aires is hit again by another sudden blackout.

Aug. 14, 2003: A massive blackout, the largest in North American history, strikes the northeastern United States and parts of Canada, affecting some 50 million people. Blame centers on a single power generation plant owned by Ohio's FirstEnergy Corp.

Aug. 23, 2003: Finland's capital of Helsinki and its suburbs, including the international airport at Vantaa, are blacked out, affecting more than a half million people in a country known for its first-rate electrical grid.

Aug. 28, 2003: At the height of London's evening rush hour, a massive power outage strikes the city and southeast England. The blackout is blamed on a “fault” in the national electrical grid. The event that Britain's Network Rail calls “unprecedented” stops 18,000 trains, including 60% of the London Underground.

Sept. 1, 2003: Malaysia's capital of Kuala Lumpur and five Malaysian states are struck by a massive blackout. Workers in the Petronas Towers, the world's tallest buildings, are trapped in elevators. With signal lights out, traffic in downtown Kuala Lumpur grinds to a virtual halt. What makes the event all the more perplexing is that blackouts are very rare in the country.

Sept. 2, 2003: Cancun, Mexico, plunges into a blackout. Power is out for six hours, affecting three million people.

Sept. 23, 2003: Eastern Denmark and southern Sweden, including the cities of Copenhagen and Malmo, lose power in what is described as a “very unusual” blackout that affects four million people.

Sept. 28, 2003: A massive power failure strikes Italy, affecting 57 million people. It's later blamed on a tree that hit a high-voltage transmission line.



Sidebar: A Closer Look at the National Power Grid

In 1965, the failure of a single faulty relay at the Sir Adam Beck generating station caused a cascading failure that left the northeastern U.S. and parts of Canada without power for 18 hours and prompted the creation of the North American Electric Reliability Council (NERC). Since then, the national grid has operated in a fairly robust manner, but that all changed in 2001 with the California energy crisis. That was only the first of many signs of how weak the grid had become, and its failings eventually culminated in the recent Northeast catastrophe.

The U.S. power grid is actually made up of four major sub grids that comprise hundreds of large generating stations, a few thousand small generating stations, and 100,000 substations. The system consists of 680,000 miles of transmission and distribution lines.

The grid's interoperability was initially designed to fortify it against single points of failure. Ironically, that interconnection is also the grid's Achilles' heel. During peak hours, many regions within these grids cross into the “danger zone,” operating within 5% to 10% of absolute capacity, far beyond their design parameters. Depending on the season, many regions run on even thinner margins daily during peak conditions.

Ideally, a grid's generating stations and transmission and distribution lines should operate with a 10% to 15% margin. That margin allows for a failure somewhere in the network, giving alternate sources room to mitigate it. Any failure on the grid must be instantly isolated within the overloaded area; in other words, power must be cut to that specific area. Failing to do so allows the overloaded area to draw power from other regions of the grid, cascading the overload through the entire grid like falling dominoes.
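The domino effect is easy to see in a toy model (purely illustrative; the region capacities and loads below are invented, not data from the article). When a failed region's load spills onto neighbors already running near their limits, the overload propagates; cutting power to the failed area contains it:

```python
# Toy cascading-overload model (illustrative only; all numbers are invented).
def cascade(capacities, loads, first_failure, isolate=False):
    """Return the set of failed regions after a cascade (or after isolation)."""
    failed = {first_failure}
    while True:
        survivors = [i for i in range(len(loads)) if i not in failed]
        if isolate or not survivors:
            # Isolation: the failed area is cut off and its load shed,
            # so nothing spills onto neighboring regions.
            return failed
        # Otherwise the failed regions' load spills evenly onto the survivors.
        spilled = sum(loads[i] for i in failed) / len(survivors)
        newly_failed = {i for i in survivors if loads[i] + spilled > capacities[i]}
        if not newly_failed:
            return failed
        failed |= newly_failed

# Four regions, each already in the "danger zone" (within 5% to 10% of capacity).
capacities = [100, 100, 100, 100]
loads      = [92, 95, 90, 93]
print(cascade(capacities, loads, first_failure=0))                # {0, 1, 2, 3}: grid-wide
print(cascade(capacities, loads, first_failure=0, isolate=True))  # {0}: contained
```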

The most common way to control excess demand is via brownouts (sagging the line voltage) or rolling blackouts (taking turns shutting power down to the least critical areas).

About the Author

Alan Katz (MGE UPS Systems) and David Pereles
