This article is part of an ongoing series on privacy program metrics and benchmarking for incident-response management, brought to you by Radar, a provider of purpose-built decision support software designed to guide users through a consistent, defensible process for incident management and risk assessment. Find earlier installments of this series here.

It’s commonly understood in operations and management practice that measuring performance is one of the best ways to pursue long-term improvement. Consider the H. James Harrington quote, “If you can’t measure something, you can’t understand it. If you can’t understand it, you can’t control it. If you can’t control it, you can’t improve it.” Given this, why do so many privacy programs struggle to effectively measure and report on their work? This article series is devoted to establishing consistent program metrics and benchmarking your privacy incident management program; previous installments offered examples of how data can fuel incident insights and ultimately inform process improvements.

This month’s installment focuses on examples of actionable insights from a relatively straightforward privacy program metric: incident volume over time. Radar incident metadata shows wide divergence in incident-volume peaks and valleys over time, with very little consistency regardless of industry. Since every organization experiences different incident volume month over month and year over year, it is imperative that organizations measure volume over time to set expectations and address the issues and trends that emerge within their unique culture of compliance.

Incident vs. breach

When we’re looking at incident volumes, keep in mind that an incident occurs when there is unauthorized use or disclosure of regulated data such as personally identifiable information (PII) or protected health information (PHI). An incident is considered a data breach when it meets specific legal definitions per data breach laws. Data breaches typically require notification to the affected individuals, state and federal agencies, and sometimes credit reporting agencies, local law enforcement, or the media. Only a small percentage of privacy incidents escalate into data breaches, and there is much to be learned from the large number of everyday incidents that occur within an organization.

Proactive privacy teams use incident-response information as a catalyst for action. If your current processes are designed to let you track and analyze incident and response trends over time, you can use them to achieve continuous improvement in both privacy and security. When looking at the volume of incidents you see over time, consider the following:

An increase in incident volume is an indicator of successful employee training: Many teams conduct annual security training for employees, or on-demand training to address issues as they arise. Privacy teams report that training has a counter-intuitive benefit: more incident reporting. Following a successful training, look for an increase in incident escalation and reporting as a signal that you have effectively conveyed your processes and the importance of privacy awareness. Even better, you may see an increase in the number of incidents reported but a drop in the number of breaches requiring notification.

Real-time daily, weekly, or monthly reporting allows for increased responsiveness to emerging trends: Being able to view the granular details of your incident-response management process allows you to catch performance issues as they emerge. For example, a month-over-month increase in incident volume may prompt the privacy team to dig into where the increase is coming from. Is there a pattern? Is it an anomaly? Does a sudden spike in incidents from a particular location or department indicate a new systemic or seasonal issue that could be prevented with timely reminders and training?
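As a minimal sketch of this kind of monitoring, assuming your incident records can be exported with a received date, the following Python snippet (the file name and “received_date” column are hypothetical placeholders) aggregates incidents by month and flags month-over-month spikes:

```python
import pandas as pd

# Load an incident export; the file name and "received_date" column are
# hypothetical placeholders for whatever your intake tooling produces.
incidents = pd.read_csv("incident_export.csv", parse_dates=["received_date"])

# Count incidents per calendar month.
monthly = (
    incidents.set_index("received_date")
    .resample("MS")  # "MS" = month start
    .size()
    .rename("incident_count")
)

# Flag any month more than 25 percent above the prior month (an
# illustrative threshold; tune it to your own baseline).
spikes = monthly[monthly.pct_change() > 0.25]
print(spikes)
```

Grouping the same counts by location or department before resampling is one way to answer the “where is it coming from” question.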

Recording peaks and valleys in incident reporting creates opportunities for attribution: Over time, once you’ve built up your privacy metrics, you’ll be able to compare quarterly trends (always of interest to business unit leaders and executive boards) and year-over-year trends. These metrics can help identify seasonality in your business or your privacy program. Some examples: Are you noticing an influx of incidents that correlates to tax season, a dip in incident reporting as your employees enter the heavy travel days of summer, or simply a drop-off at the end of each quarter when your employees are hard-pressed to meet quarterly goals? Having this kind of data lets you know when best to time your training, or when to ensure your system protection measures are secure and reinforced.
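To surface seasonal patterns like these, one approach (again a sketch using the same hypothetical export) is to pivot incident counts into a year-by-quarter table so recurring peaks and dips stand out:

```python
import pandas as pd

incidents = pd.read_csv("incident_export.csv", parse_dates=["received_date"])

# Tag each incident with its year and quarter, then cross-tabulate so
# recurring seasonal patterns stand out at a glance.
incidents["year"] = incidents["received_date"].dt.year
incidents["quarter"] = incidents["received_date"].dt.quarter
seasonality = pd.crosstab(incidents["year"], incidents["quarter"])

# Rows are years, columns are quarters; a consistently low Q3, for
# example, may line up with the heavy travel days of summer.
print(seasonality)
```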

Remember: when it comes to incidents, no news is bad news. If you’re challenged to manage these everyday incidents, you also likely recognize that more incidents can mean more work for privacy teams; after all, every incident, whether or not it rises to the level of a data breach, requires a multi-factor risk assessment in order to prove compliance. While it’s true that more incidents may bring more work to our teams, there are ways privacy teams can bring consistency and efficiency to their processes and better operationalize incident response. Here are a few suggestions.

  • Streamline incident intake: Is there one mechanism employees across your organization can use to report an incident? Web forms streamline incident intake and bring consistency to the incident details captured, with timely alerts to privacy and security teams for immediate action.

  • Create built-in and consistent documentation: When an incident is reported, is there a centralized place for your team to add pertinent incident details and risk factors in a consistent and compliant manner, document the outcome of your incident-risk assessment, and record notification decisions and associated evidence (such as copies of letters and notice execution information)? This is critical in supporting your organization’s burden-of-proof obligation under breach laws.

  • Automate the multi-factor risk assessment: Technology-aided automation of risk-of-harm quantification for each incident eliminates the subjectivity, inconsistency, and significant delays inherent to manual approaches for determining whether notification is required. Most incidents are routine and involve similar risk factors that should be scored consistently and objectively, while a very small number of incidents call for additional considerations beyond the established and approved risk-scoring methodology (a minimal scoring sketch follows this list).
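The sketch below illustrates the general idea of rule-based, multi-factor scoring; it is not Radar’s actual methodology, and the factors, weights, and threshold are illustrative assumptions a privacy team would replace with its own approved model:

```python
# A minimal sketch of rule-based, multi-factor risk scoring. NOT an
# actual notification methodology; factors and weights are illustrative.

def score_incident(data_type: str, encrypted: bool,
                   recipient: str, recovered: bool) -> int:
    """Return a numeric risk score from a fixed set of factors."""
    score = 0
    score += {"pii": 3, "phi": 4, "other": 1}.get(data_type, 1)
    score += 0 if encrypted else 3          # unencrypted data raises risk
    score += {"internal": 0, "vendor": 2, "unknown": 4}.get(recipient, 2)
    score += 0 if recovered else 2          # unrecovered data raises risk
    return score

# Scoring every incident with the same function keeps routine cases
# consistent; unusually high scores can be routed for human review.
if score_incident("phi", encrypted=False, recipient="unknown",
                  recovered=False) >= 8:
    print("Escalate for notification analysis")
```

The benefit of encoding the model this way is that two analysts scoring the same facts always get the same answer, which is exactly the consistency manual approaches lack.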

Privacy program metrics are not a report card; they signal an opportunity to improve

When I discuss privacy program metrics with colleagues in the field, hesitancy to measure program outcomes often stems from the known inadequacy of the systems and tools used to produce accurate data for analysis. Additionally, there are no good industry benchmarks to compare against. Consequently, there’s a concern that measuring outcomes under these circumstances could lead to unintended consequences. I can sympathize with these concerns.

But I also believe, to paraphrase Harrington’s quote above, that you cannot understand or improve your work if you do not measure it, and we are all in the business of improving privacy in our organizations and protecting the personal data we are entrusted with. When your organization holds private, personal data, its reputation is built on trust. And we are all responsible for measuring and continuously improving our efforts in that pursuit.

About the data used in this series: Radar ensures that the incident metadata we analyze is in compliance with the Radar privacy statement, terms of use, and customer agreements. The information extracted from the platform for purposes of statistical analysis is not identifiable to any customer or data subject.