
Why demonstrable accountability matters


An “accountability tension” is emerging that has the potential to stifle technology and data-driven innovation.

With artificial intelligence, including advanced analytics and big data, beginning to affect almost every aspect of society, harnessing its potential for good while addressing its risks will require an end-to-end, comprehensive, programmatic, repeatable and demonstrable governance system, adopted by every organization that seeks to use complex data analytical processing, such as AI, as part of its strategy and objectives.

At the same time, there is a widening trust gap between some regulators and general business practices. Regulators have expressed surprise at how unprepared organizations are to meet even legal compliance-driven requirements.

The Information Accountability Foundation has been leading the accountability movement for more than a decade, first with the Global Accountability Dialogue and then in 2018 with the release of its Ethical Data Stewardship Accountability elements. Accountability is a basic tenet of 21st-century data protection law and governance. It is referenced explicitly in the EU General Data Protection Regulation, Canada’s Personal Information Protection and Electronic Documents Act, and the APEC Privacy Framework.

This trend toward a greater focus on accountability has continued with the recent release of draft guidance to support the new Singaporean Data Protection Law and with draft legislation introduced in Canada, the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act, as an update to PIPEDA.

The IAF, as its name suggests, advocates for building out measures so organizations can manage and use data in a responsible and answerable manner. To be trusted, accountability has always needed to be demonstrable. The IAF, which studies the policies and practices necessary to use data wisely in ways that serve people, is finding that organizations need to be explicitly accountable in a more formal way.

Why? More complex data analytical processing requires more accountability, and there is increasing evidence that regulators do not trust the current state of accountability. Regulators report they do not find accountability measurable and adequately implemented. For example, in its 2018-2019 Report to the Parliament of Canada, the Office of the Privacy Commissioner criticized the governance component of accountability when it stated:

“We are increasingly noticing the ways in which [accountability] has become deficient in today’s world of complex data flows and less than transparent business models. Our recent investigations into Facebook and Equifax, for example, revealed that accountability as traditionally framed in the law is not strong enough to protect Canadians from the intrusive practices of companies who say they are accountable, but are in fact found not to be.”

Against this backdrop, a number of leadership companies are going beyond compliance and adopting “trust”-driven governance approaches because they view trust as a business-critical goal. Their goal is to make sure their technology, processes and people work in concert to maintain the high levels of trust expected by their many stakeholders. In short, they are implementing trust-driven, demonstrable accountability processes that go beyond legal compliance.

Fair processing will be key to the responsible, trusted use of technology and data. The IAF has released a report to address this "Movement Towards Demonstrable Accountability — Why It Matters." This report:

  • Profiles what these companies are doing to build trust in data innovations through enhanced accountability.
  • Suggests some considerations for other organizations and for regulatory guidance and future public policy.

This combination of the trust gap and a study of what these leadership companies are doing also suggests a need to evolve the Ethical Data Stewardship Accountability elements the IAF introduced in 2018 to encompass “fair processing.”

This report also outlines new Fair Processing Demonstrable Accountability Elements. These elements will enable many of the economic benefits that advanced analytics and the technologies they drive, such as AI, can bring to individuals, groups, society and organizations, while meeting those groups’ broader needs for ethical and fair data processing.

The IAF believes these new elements advance what the original Essential Elements of Accountability are capable of doing in two key ways: First, when implemented, they facilitate the trust necessary to enable the adoption of data-driven technologies like AI and its associated data use. Second, they demonstrate accountability that goes beyond compliance. The original Essential Elements of Accountability, while key, simply meet compliance objectives regarding existing law. For these organizations, while “trust” is the business objective, an outcome of these leading organizational practices is a demonstrable programmatic approach to “fairness.”

This system is a step up from the accountability required for less complex data processing and will require a trust-driven approach rather than merely a legal compliance approach to accountability. It moves accountability based on legal privacy and data protection requirements toward accountability based on the “fair processing” of data.

These leadership companies have basic accountability mechanisms that are easily demonstrable and adequately implemented. 

However, these leadership companies have gone well beyond the basics of accountability. The IAF learned these companies are implementing demonstrable accountability against a broader objective of trust. These findings match the research in The Ohio State University report, "Business Data Ethics: Emerging Trends in the Governance of Advanced Analytics and AI," which concluded that a group of organizations implement ethical approaches that go beyond legal compliance objectives to build trust in complex data analytical processing. While ensuring they meet compliance-based requirements, these leadership companies have shifted the focus to fair processing. Regulators could look more to these leadership companies as examples of demonstrable accountability.

Organizations that create products and services and make decisions based upon a demonstrable accountability foundation to build trust can earn the ability to use advanced analytical data processing and AI to their full potential. This is why demonstrable accountability matters.


