Two decades ago, the National Institute of Standards and Technology released a special publication with guidance for organizations that wanted to build an "information technology security awareness and training program." The guide was meant to help federal agencies develop all-staff training programs, but it was also aimed at any organization with a similar goal.
Privacy wasn't in the picture at all in 2003, but now it's front and center. As every U.S. Federal Trade Commission consent order reminds us, privacy training programs are now an essential component of a robust privacy strategy for any organization.
This week, the NIST released a draft update of this special publication with a new title: Building a Cybersecurity and Privacy Learning Program. Two of the stated goals of the update will be familiar to many privacy professionals who build or manage internal training programs in their organizations:
- Integrate privacy with cybersecurity in the development of organization-wide learning programs.
- Introduce a life cycle model that allows for ongoing, iterative improvements and changes to accommodate cybersecurity, privacy, and organization-specific events.
The NIST points to the Office of Management and Budget's groundbreaking Circular A-130 as part of the inspiration for combining privacy and security training into a single holistic approach. As the agency explains, Circular A-130 "emphasizes the role of both privacy and security in the federal information life cycle and requires agencies to have both security and privacy awareness and training programs."
Similar trends have taken place in the commercial sector, as explored in this joint whitepaper from IAPP and (ISC)2. The privacy and cybersecurity professions are increasingly interdependent. As the paper highlights, "crossfunctional training and communication will be ever more necessary in building the next generation of data stewards."
The NIST's proposed update takes this philosophy to heart, focusing on the interdependent and complementary objectives of the cybersecurity and privacy professions. It proposes one of the most rigorous definitions of the distinction between the two fields that I have seen:
Cybersecurity programs are responsible for protecting information and information systems as well as operational technologies from unauthorized access, use, disclosure, disruption, modification, or destruction (i.e., unauthorized system activity or behavior) in order to provide confidentiality, integrity, availability and safety.
Privacy programs are responsible for managing the risks to individuals associated with data processing throughout the information life cycle in order to provide predictability, manageability, and disassociability, as well as ensuring compliance with applicable privacy requirements.
The draft publication goes on to emphasize the marriage between these roles and the importance of developing a strong internal culture of privacy and cybersecurity. It provides a detailed roadmap for how to do so in an ongoing, adaptable way. It also provides tips for segmenting training programs based on the level of privacy exposure of an individual's role — from all-staff trainings to targeted modules for roles with "significant" privacy and cybersecurity responsibilities.
Ongoing training and continuous improvement are essential components of privacy programs that often don't get as much attention as flashier elements of operational governance. Whether an organization builds a program from scratch or relies on outside modules like IAPP's Foundations of Privacy and Data Protection, ongoing trainings should be thoughtfully designed to mesh with the workflow and risk profile of the organization.
Federal government agencies are under affirmative obligations to conduct this training, thanks to the Privacy Act of 1974 and OMB Circular A-130.
Similar training requirements are spreading across consumer privacy laws in U.S. states. And federal proposals like the American Data Privacy and Protection Act may one day require robust privacy training programs too. ADPPA's most recent public text includes a risk-based approach to mandatory trainings, requiring both controllers and service providers to:
"Implement reasonable training and safeguards within the covered entity and service provider to promote compliance with all privacy laws applicable to covered data the covered entity collects, processes, or transfers or covered data the service provider collects, processes, or transfers on behalf of the covered entity and mitigate privacy risks, including substantial privacy risks, taking into account the role of the covered entity or service provider and the information available to it."
Whether under affirmative training obligations or not, ongoing education and culture building around privacy best practices is essential for mitigating privacy risks. NIST's new special publication offers a helpful starting point.
The NIST is seeking public comments on the draft, due 27 Oct.
Here's what else I'm thinking about:
- Another federal bill for kids online safety brings another round of intense advocacy against it. Electronic Frontier Foundation's Jason Kelley and Sophia Cope posted a blog explaining their concerns with the Protecting Kids on Social Media Act. The bill was introduced by Sen. Brian Schatz, D-Hawaii, and includes a collection of ideas from similar federal and state legislation, including parental consent for teen use of social media, age verification requirements, and a ban on recommendation algorithms for youth.
- If the FTC approves face scanning for verified parental consent (VPC), it should require independent audits of the tool. Or so claims a set of privacy advocacy groups in a comment to the FTC in response to its request for input on the application to allow a new method of VPC under the Children's Online Privacy Protection Act. The comment also raises questions about whether the method can effectively verify parental relationships, or merely adult status. Compare this with an earlier op-ed explaining the arguments in favor of approving facial age estimation as a VPC method.
- One day soon there will probably be an EU AI Act. If you need to catch up on just what is being considered and debated in the current trilogue, and how it relates to data protection standards, my colleague's analysis is a great place to start.
Please send feedback, updates and training incentives to email@example.com.