When Microsoft’s Senior Privacy and Safety Strategist Tracy Ann Kosa started her job 16 months ago after years as a government civil servant, she really had no idea what to expect. In fact, Microsoft didn’t either. There was no one working on privacy measurement at the time, and the organization had some concerns about her role, too, Kosa told the audience at the IAPP Canada Privacy Symposium during her session, “Measuring Privacy.”
Kosa quickly learned that her work at Microsoft would involve more than checking the proverbial boxes on compliance. She would need to actually drive the business plan. But to do that, she’d need to justify her existence, so to speak.
Now that she worked in a big privacy office, she needed to show that her role, and the resources dedicated to it, were essential by spelling out the opportunities for measurement she could provide and the ways the company could leverage those metrics for success.
Specifically, that meant providing evidence of data privacy compliance, data-driven decision-making and the overall impact of the privacy program.
“If you’re a big company with tons of resources, you probably have a risk process in place,” Kosa said. “If you’re a privacy officer in a smaller company, you are the risk process.”
A good place to start is to figure out where the antitheses lie. If there are aspects of the program that simply will never comply with legal or regulatory requirements, highlight those first.
“Those are the things where you can say confidently, ‘We can’t meet this requirement because it’s the antithesis of what we’re trying to provide as a service,’” she said. “The privacy professional’s job isn’t to say no, it’s to surface the data to say, ‘Okay, that’s illegal.’ Then you let someone with a higher pay grade make a decision about that. That’s our job, to say, this is what we can and can’t do.”
What Do I Even Measure?
The first step in taking a measurement, though, is deciding what’s important to measure. And that means first looking at your privacy office’s model. There’s the hub-and-spoke model, in which there’s a central privacy office and a number of people—at Microsoft, it’s 300 or so—who work in privacy but are embedded in various groups, working alongside the project architects themselves.
Kosa herself is about five steps removed from the customer and handles policy-setting, the official corporate response to the Snowden revelations, the communication of the message and the creation of privacy tools.
If your organization is using the “hub” model, first ask questions like, “Do we have a program in place? Do we have tools? Do we have a system? Can we tick all those boxes?” Kosa said.
That’s especially important in Canada, where the federal privacy law mandates that organizations have a chief privacy officer (CPO) to oversee compliance. But, Kosa asks, is there documentation to prove you have a CPO? If the CPO quit tomorrow, could you prove you had one?
Don’t Waste Your Time Collecting Data
Essential to measuring risk is creating a usable report that provides the necessary, digestible metrics to leadership. But before you begin that process, Kosa warns, decide what is important and ask leadership exactly what it wants to know.
“I spent six months looking at ‘what data do we collect?’” Kosa said. “Don’t do that. It’s a huge waste of time. Spend way more time asking, ‘What is the question I’m trying to answer?’ and then figure out what data you need.” She advises asking leadership, “‘When I give you this report you’ve hired me to create, what do you want to know?’ Because your senior leader probably has some other question besides what you think you’re answering. Answer that question first, because that’s how you’re going to show them there’s some value there.”
Set Targets
When you’re reporting back numbers, it’s easy for management to get lost in them. What do they mean? What if you report that your evaluation produced a score of six? What does that mean?
It’s also essential to spell out how you measured the program. Where did the organization want to be? It’s possible that it was realistically only aiming to land a score of five out of 10, knowing it doesn’t have the resources or ability to hit a nine. Set those targets prior to producing a report, Kosa said, adding, “You don’t want to report on numbers until you’ve got answers to those questions.”
Once the targets have been established, you can start collecting the data. If you’ve established that you’re aiming to hit 50 percent of compliance requirements, then you can start looking at outcomes. How many data sets is the privacy person actually reviewing?
A good model to use in your initial metrics report is the Generally Accepted Privacy Principles (GAPP). Then:
- Choose a tool to use, like a simple Excel spreadsheet.
- Collect the data and score yourself based on the targets you’d set.
- Analyze that data, and check it against what GAPP calls for.
- From there, it’s time to create a report and make recommendations.
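To make the scoring and analysis steps concrete, here is a minimal sketch, in Python rather than the simple Excel spreadsheet Kosa mentioned, of comparing per-principle scores against pre-set targets. The principle names are GAPP’s ten principles; the scores and targets below are invented placeholders, not figures from the session.

```python
# Minimal sketch: score each GAPP principle against a pre-agreed target.
# Scores and targets are placeholder values for illustration only.

GAPP_PRINCIPLES = [
    "Management", "Notice", "Choice and Consent", "Collection",
    "Use, Retention and Disposal", "Access",
    "Disclosure to Third Parties", "Security for Privacy",
    "Quality", "Monitoring and Enforcement",
]

# Hypothetical maturity scores (1-5) and the targets set before reporting.
scores = {p: 3 for p in GAPP_PRINCIPLES}
scores["Notice"] = 4
scores["Access"] = 2
targets = {p: 3 for p in GAPP_PRINCIPLES}

def gap_report(scores, targets):
    """Print each principle's score, its target and whether it meets the target."""
    for principle in GAPP_PRINCIPLES:
        gap = scores[principle] - targets[principle]
        status = "on target" if gap >= 0 else f"below target by {-gap}"
        print(f"{principle:30} score {scores[principle]} / target {targets[principle]} ({status})")

gap_report(scores, targets)
```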
It may seem like a minor detail, but color-coding the report to indicate your program’s relative maturity level can help get the message across. Color-code a top score of five, for example, in green, and a score of three out of five in yellow. According to GAPP standards, a score of three across the board would indicate a fairly robust privacy program.
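The color bands themselves are a judgment call; a rough sketch of the traffic-light mapping described above might look like the following, with the thresholds treated as illustrative assumptions rather than anything GAPP prescribes.

```python
# Rough sketch of the color-coding idea: map a 1-5 maturity score
# to a report color. Thresholds are illustrative assumptions.

def maturity_color(score: int) -> str:
    if score >= 5:
        return "green"   # top score
    if score >= 3:
        return "yellow"  # "fairly robust," but keep pushing
    return "red"         # below target; needs attention

for s in (5, 3, 1):
    print(s, "->", maturity_color(s))
```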
Be careful not to tell the businesses concerned they’re doing a “good enough” job, Kosa warned.
“If you tell the businesses they are doing privacy well, they’ll stop doing it so well,” she said. “So get your baseline score; play with it; decide where it should be. That’s the beauty of qualitative numbers rather than quantitative ones. You need to find a way to express zero.”
The next and final step is publishing a report. And that’s going to be a heavy lift.
“Do not underestimate the amount of time you’ll have to spend communicating about this,” she said. “People freak out when you measure things. Everybody snaps back to kindergarten when they got ‘fair,’ ‘good’ and ‘very good’ ratings.”
Once the report is published, the strategy is going to be communicate, communicate, communicate, she said.
“Talk about what you’re going to do; describe what you want to do,” she said. “Tell them you just want to collect the data, but it’s not for public consumption. Talk to stakeholders. Don’t ask for permission; just tell them that’s what you’re doing. You don’t need everyone to agree, but you want them to know what’s going on.”
Finally, Kosa said, don’t be unrealistic about what you’re going to get out of your metrics.
“You’re not going to know everything,” she said. “The goal here is to try and get an idea of what you don’t know, what potentially is happening that’s good that you could really leverage and what’s happening that’s bad that could potentially hurt you.”
Editor’s Note: A previous version of this story misnamed the Generally Accepted Privacy Principles as the Generally Accepted Accounting Principles.