When we consider the potential of the Smart Grid, it’s natural to think in grand terms. We can envision “smart” appliances and home automation tools that enable consumers to respond to real-time fluctuations in energy costs and fundamentally shift demand. But as I discussed in my previous post, those kinds of changes will require a major cultural shift—a change in the way ordinary people think about energy consumption—that is likely still a few years off.

We should realize, however, that apart from these loftier goals, the Smart Grid also holds the potential to deliver major savings right away, just by providing more visibility into the grid.

A 2005 study conducted for the Department of Energy estimated that outages cost U.S. consumers and businesses about $80 billion annually. Of course, some outages simply can’t be prevented; a major storm or natural disaster will bring down portions of the grid whether or not they are equipped with sensors. But the report notes that, of those lost dollars, roughly two-thirds ($52 billion) result from momentary power interruptions of a few minutes or less. It’s in these smaller-scale outages that Smart Grid technology can make a real difference.

Lack of Visibility = Higher Costs

Consider capacitor banks. Faults in capacitor banks can and do cause outages, but most utilities inspect capacitor banks just once or twice a year. By outfitting capacitor banks with monitors, utilities could get bi-weekly reports about voltage readings, heat readings, and other metrics, as well as real-time alarms if something goes severely out of tolerance. Ultimately, they could be much more proactive about identifying and correcting problems before they cause an outage.
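For the technically inclined, the kind of tolerance check described above can be sketched in a few lines of Python. The metric names and threshold values here are purely illustrative assumptions, not figures from any real monitoring system:

```python
# Hypothetical sketch: flag capacitor-bank readings that drift out of tolerance.
# Metric names and acceptable ranges are illustrative assumptions only.

TOLERANCES = {
    "voltage_kv": (13.2, 14.4),   # assumed acceptable voltage range, kV
    "temp_c": (-20.0, 65.0),      # assumed acceptable temperature range, deg C
}

def check_bank(readings):
    """Return a list of (metric, value) pairs that are out of tolerance."""
    alarms = []
    for metric, (low, high) in TOLERANCES.items():
        value = readings.get(metric)
        if value is not None and not (low <= value <= high):
            alarms.append((metric, value))
    return alarms

# An overheating bank triggers an alarm; healthy readings do not:
print(check_bank({"voltage_kv": 13.8, "temp_c": 71.0}))  # [('temp_c', 71.0)]
```

The same comparison logic works whether the readings arrive in a bi-weekly report or as a real-time telemetry stream; only the data source changes.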

Line faults are another area ripe for Smart Grid optimization. Today, most utilities rely on electromechanical fault detectors on individual lines, typically a ball that flips from black to red in the event of a fault. But most utilities have no centralized mechanism for knowing when and where a detector has flipped. So, isolating a line fault means sending someone out to drive around and check all the fault detectors. (And if it’s nighttime, it means driving around with a flashlight!)

If that sounds like searching for a needle in a haystack, it often is. In many cases, the utility simply can’t determine where the fault has occurred—until that fault eventually becomes an outage.

Now, imagine a world where all line detectors communicate with a utility’s operations center. Utilities can regularly ping all fault detectors to check their status, and any fault triggers an alert, allowing the utility to immediately pinpoint the problem.
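That polling loop can be sketched very simply. The detector IDs and data shape below are hypothetical; a real operations center would poll detectors over a communications network rather than read a local dictionary:

```python
# Hypothetical sketch of the polling loop described above: ping every fault
# detector and raise an alert for any that has tripped. Detector IDs are
# illustrative assumptions.

def poll(detector_statuses):
    """Given a mapping of detector id -> tripped flag,
    return the ids that need attention, sorted for stable output."""
    return sorted(d for d, tripped in detector_statuses.items() if tripped)

statuses = {"feeder-12-A": False, "feeder-12-B": True, "feeder-37-C": False}
for detector in poll(statuses):
    print(f"ALERT: fault detected at {detector}")
```

Instead of a crew driving the whole circuit, the alert narrows the search to a single line segment.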

This kind of Smart Grid innovation may seem relatively mundane. But, considering there is, on average, one fault detector for every thousand electricity meters, it can make a huge difference in a utility’s ability to prevent outages.

Expanding Intelligence

While preventing short-duration outages would most benefit business and industry, Smart Grid communications would also help ordinary residential customers. Today, most people believe that if their power goes out, the utility is probably already aware of it. Well, the industry may not like to advertise this fact, but most of the time, it isn’t.

With little or no real-time intelligence about the grid, utilities often have no idea how big an outage is or how many people it affects. In a grid equipped with smart meters, however, they do. Even that basic level of intelligence—where an outage is occurring, how many meters are affected—will make a world of difference. Utilities will be able to better understand the nature of any problem and marshal the right resources to quickly correct it.
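Even that basic tally, which feeders are dark and how many meters on each, takes only a few lines of code once meter statuses are flowing in. The feeder names and record format below are illustrative assumptions:

```python
# Hypothetical sketch: summarize an outage's extent from smart-meter
# status reports. The (feeder_id, is_reporting) record shape is assumed.
from collections import Counter

def outage_summary(meter_reports):
    """meter_reports: iterable of (feeder_id, is_reporting) pairs.
    Returns a dict of feeder_id -> number of meters not reporting."""
    return dict(Counter(feeder for feeder, reporting in meter_reports
                        if not reporting))

reports = [("feeder-12", False), ("feeder-12", False), ("feeder-37", True)]
print(outage_summary(reports))  # {'feeder-12': 2}
```

With that summary in hand, a dispatcher knows immediately which part of the grid to send crews to, and roughly how many customers are waiting.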

So next time you hear about how much Smart Grid technology will cost, remember: there’s another side of the ledger, and Smart Grid savings are likely to be substantial.

What do you think? What do you see as the cost advantages of the Smart Grid? We look forward to your comments.