

In 2010, the National Institute of Standards and Technology (NIST) published its first interagency report on guidelines for smart grid cybersecurity. Since then, developments in technology and implementation have required a second draft, resulting in the birth of Volume 2, and those who've worked over the years at revamping the guidelines hope they'll form the basis for state laws and corporate policies in the name of the smart grid's eventual success.

Four years ago, the guidelines were more about theory, but the new recommendations respond to the actual deployment of smart grid technologies, and those involved say the guidelines that have especially changed are those surrounding privacy.

The first draft focused largely on the fact that there would be privacy risks and what those risks might be, whereas the revision identifies tactical, actionable steps that can be taken to mitigate those risks. It focuses on the need for utilities within the smart grid to, among other things, create a privacy program, conduct privacy impact assessments, assign a person to privacy oversight and create training and awareness opportunities.

Rebecca Herold, CIPP/US, CIPM, CIPT, has been running NIST's smart grid privacy subgroup since 2009 and says both the group and the process have evolved significantly since their early beginnings.

"Back then, people weren't too worried about privacy with regard to how it related to electricity usage," she recalls.

That shifted somewhat as well-known utility companies started to roll out smart meters, which record household energy consumption and communicate it back to power providers, without effectively communicating what kind of data those meters would collect, how granular it would be and who would have access to it. But consumers really flipped after a highly hyped documentary called "Take Back Your Power" debuted, essentially asserting that smart meters are the devil to anyone who cares about privacy, their utility bills or getting hacked.

The film had a lot of erroneous messaging, Herold said, but it resulted in panicked consumers making a lot of phone calls to their utility company, which, in the end, meant utilities understood the need for better awareness and training on transitioning to the smart grid.

And it's a good thing, because the privacy subgroup's composition has shifted considerably since its 2009 roster and the release of the first report. At that time, it contained 95 to 100 members, many of them privacy advocates from groups like EPIC, the CDT and the FPF. But it shrank to just 26 members once NIST stopped funding the group and it was put under the purview of the Smart Grid Interoperability Panel (SGIP), at which point members of the subgroup were required to pay to participate. Now, there are about three privacy experts on board.

"Once they were gone, it was a little bit of a challenge to keep perspective of, 'Why do you bother with this privacy stuff?'" said NIST's Tanya Brewer, one of the privacy experts on board.

However, she adds, the utilities were certainly more receptive this time around, equipped with a nuanced understanding that privacy must be a foundational part of implementation rather than an afterthought, or consumers aren't going to get on board.

Even initially, it wasn't that the utility reps had an axe to grind with privacy, Brewer and Herold said. But it can be a tough sell if you're a privacy pro at a utility company and you don't have the right sales pitch.

"It wasn't that any of the other members were anti-privacy," Brewer said. "They weren't at all. But they had to quantify the cost of things." She said they were looking at things much more from the perspective of how much it would cost and how they could sell it to their boards if they didn't have a good case to make. "So we had to kind of look at this from the perspective of the utilities a little harder," Brewer said. "It's true, they live by a bottom dollar and by what their boards tell them."

It's just like selling your company on a computer security budget, Brewer said. Sure, there might be a problem if you don't amp up your security. But how big of a "might" is it? Enough to spend thousands? Millions? That matters when you're drafting guidelines you're asking major corporations to adopt.

"We had to look at things and be pretty realistic," Brewer said.

It was clear that privacy safeguards weren't going to be one-size-fits-all for every utility. As a result, the recommendations set forth in NISTIR 7628, Revision 1, are scalable to various systems, depending on an individual company's determined risks.

Those risks were largely developed based on use cases derived from the expertise of engineers on the ground.

"We spent a year just trying to produce use cases on their own because we would create them and get feedback from engineers like, 'We really don't understand this; we don't understand where we'd apply this or what it means'," Herold said. "That's when finally we changed. We could produce use cases, but if they don't understand where privacy exists as part of these use cases, they can't effectively build these controls into the systems."

Herold and Brewer feel confident with this latest draft, which includes trainer slides and tools to gauge privacy awareness among employees—not to be used for grading but to indicate where more training is necessary.

And the future looks bright, compared to 2009.

"At the beginning, that was always the pushback," Herold said. "We can't afford training and awareness. But now, some of the people that were the biggest roadblocks are now the ones that are agreeable, and I think the tide is changing."

Brewer agrees.

"I think we've helped them be able to take some language back to the top decision makers," she said.

