The pervasiveness of, and increasing authority vested in, artificial intelligence and autonomous systems (AI/AS) have created tremendous anxiety among the public. This has led to industry- and academic-based initiatives to address ethics in AI/AS through research and public engagement.

The IEEE Global Initiative for Ethical Considerations in the Design of Artificial Intelligence and Autonomous Systems is one such initiative designed to support engineers and serve as a springboard for developing operational IEEE Standards in AI ethics. Last week the IEEE Global Initiative released version one of its working reference, entitled “Ethically Aligned Design: A Vision for Prioritizing Human Wellbeing with Artificial Intelligence and Autonomous Systems.”

I had the honor of working with experts in information privacy and contributing to the Personal Data and Individual Access Control Committee. The committee considered issues and brainstormed notional solutions for managing personal data in an environment in which autonomous systems rely on volumes of information for basic decisions.

The motivation for the Personal Data Committee was to envision mechanisms that address data asymmetry: the concern that individuals continue to struggle to control, and to derive value from, their personal information in an algorithmic era defined by data-driven decision-making. Automated systems will only escalate this asymmetry.

To tackle this, the committee proposed recommendations built on several principles: standardized definitions of personal information, individual access to and control of personal data, clear notice and choice, and privacy and identity education. The hope is that, through continued dialogue, public engagement, and the IEEE Standards Process, these concerns and potential solutions can be carried into operational use.

Furthermore, an open, consensus-driven forum allows new issues and possible solutions to be identified as unaddressed challenges emerge. Version one of “Ethically Aligned Design” addresses the following issues focused on personal data access control, consent, and management:

1) How can an individual define and organize his or her personal data in the algorithmic era?

Individuals have limited means to control their digital identity. While social media platforms do allow users to personalize their profiles, AI/AS can characterize individuals using data sets far broader than social network activity alone. Therefore, the committee recommends that identity verification resources be applied to validate, prove, and broadcast an individual’s digital identity.
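As a rough, purely illustrative sketch of one piece of that idea, the snippet below signs and verifies an identity claim with a key the individual holds. It uses a symmetric key for brevity; a real identity verification resource would more likely rely on public-key signatures, and the names and fields here are hypothetical rather than drawn from the committee’s text.

```python
import hashlib
import hmac
import json

def sign_claim(claim: dict, key: bytes) -> str:
    """Serialize the claim deterministically and sign it with the user's key."""
    payload = json.dumps(claim, sort_keys=True).encode()
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def verify_claim(claim: dict, signature: str, key: bytes) -> bool:
    """Check that a broadcast claim has not been altered since it was signed."""
    return hmac.compare_digest(sign_claim(claim, key), signature)

# Hypothetical user-held key and identity claim.
key = b"user-held-secret"
claim = {"handle": "jane_doe", "verified_email": True}

sig = sign_claim(claim, key)
print(verify_claim(claim, sig, key))                                # True
print(verify_claim({**claim, "verified_email": False}, sig, key))   # False: tampered claim
```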

2) What is the definition and scope of personally identifiable information?

Divergent definitions of personally identifiable information (PII) have confounded attempts to create privacy standards. The committee recommends a standardization process that treats personal data as a legally protected, sovereign asset of the individual globally, locally, and across media.

3) What is the definition of control regarding personal data?

Rather than force the individual to conform to the control terms and policies of outside data handlers, the committee recommends that outside use of data conform to the individual’s terms. This approach would keep AI/AS from setting the standards by which individuals control their own data.

4) How can we redefine data access to honor the individual?

Individuals need access to their data, and must be able to consent to its use, at every stage of the information lifecycle and across any ecosystem. This means that privacy controls should be embedded in the design stage of any product or service, and that any AI/AS use of personal information must take individual preferences into account throughout the lifecycle.

5) How can we redefine consent regarding personal data so it honors the individual?

Broad consent that permits diverse uses of a multitude of data about an individual serves the data handler rather than the human source. Developing a decision framework that takes individual preferences into account for each data entity or use would give AI/AS a systematic and portable method for delivering individually tailored experiences.
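A minimal sketch of what such a per-use decision framework could look like in code is below; the categories, purposes, and default-deny rule are assumptions for illustration, not the committee’s specification.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class ConsentPreferences:
    """Per-individual consent preferences keyed by (data category, purpose)."""
    grants: Dict[Tuple[str, str], bool] = field(default_factory=dict)

    def allow(self, category: str, purpose: str) -> None:
        self.grants[(category, purpose)] = True

    def deny(self, category: str, purpose: str) -> None:
        self.grants[(category, purpose)] = False

    def permits(self, category: str, purpose: str) -> bool:
        # Default-deny: any (category, purpose) pair the individual has not
        # explicitly granted is refused, so the data handler conforms to the
        # individual's terms rather than the reverse.
        return self.grants.get((category, purpose), False)

# Example: location data may be used for navigation but not for advertising.
prefs = ConsentPreferences()
prefs.allow("location", "navigation")
prefs.deny("location", "advertising")

print(prefs.permits("location", "navigation"))   # True
print(prefs.permits("location", "advertising"))  # False
print(prefs.permits("contacts", "navigation"))   # False: never granted
```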

6) Data that appears trivial to share can be used for inferences that an individual would not wish to share.

As AI/AS becomes more embedded in all aspects of decision-making, broad data collection will create externalities that may put consent controls at risk and violate individual-centered definitions of identity. Therefore, the committee recommends collection limits that match pre-defined use parameters.
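One way to read “collection limits that match pre-defined use parameters” is as a filter applied at ingest. The sketch below is a hypothetical illustration of that reading, with invented field names and purposes.

```python
# Pre-declared use parameters: which purposes each field may be collected for.
DECLARED_USES = {
    "email": {"account_recovery"},
    "location": {"navigation"},
}

def limit_collection(record: dict, purpose: str) -> dict:
    """Keep only the fields whose declared uses include the stated purpose."""
    return {
        name: value
        for name, value in record.items()
        if purpose in DECLARED_USES.get(name, set())
    }

raw = {"email": "a@example.com", "location": "52.5,13.4", "contacts": ["b", "c"]}
print(limit_collection(raw, "navigation"))   # {'location': '52.5,13.4'}
print(limit_collection(raw, "advertising"))  # {}  nothing is declared for this use
```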

7) How can data handlers ensure the consequences (positive or negative) of accessing and collecting data are made explicit to an individual, so that truly informed consent can be given?

Over time, and as AI/AS evolves, seemingly meaningless data can be aggregated and used in ways that were previously uninteresting. Consent requirements must evolve accordingly: they must become dynamic and conditional, and they must give users the ability to kill data that is used in a way inconsistent with the original grant of permission.
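Sketched below is one hypothetical shape such a dynamic, conditional grant might take, with an expiry, a purpose restriction, and a revocation (“kill”) switch; the field names and time limits are assumptions, not part of the document.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

@dataclass
class ConsentGrant:
    """A single dynamic, conditional grant of consent."""
    category: str
    purpose: str
    expires_at: datetime
    revoked: bool = False

    def is_valid(self, purpose: str, now: Optional[datetime] = None) -> bool:
        now = now or datetime.now(timezone.utc)
        # The grant only covers its original purpose, only until it expires,
        # and only while the individual has not withdrawn it.
        return (not self.revoked
                and purpose == self.purpose
                and now < self.expires_at)

    def revoke(self) -> None:
        """The 'kill' option: withdraw consent when data is used out of scope."""
        self.revoked = True

grant = ConsentGrant(
    category="browsing_history",
    purpose="product_recommendations",
    expires_at=datetime.now(timezone.utc) + timedelta(days=30),
)
print(grant.is_valid("product_recommendations"))  # True
print(grant.is_valid("insurance_pricing"))        # False: out-of-scope use
grant.revoke()
print(grant.is_valid("product_recommendations"))  # False: consent withdrawn
```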

8) Could a person have a personalized AI or algorithmic guardian?

The current fragmented approach to managing individual privacy creates confusion, and that confusion will only grow with expanded use of AI/AS. To centralize individual privacy and identity management, the committee recommends exploring the development of AI or algorithmic guardians. These guardians would serve as educators, negotiators, and brokers of their user’s personal information within the strict boundaries of that user’s preferences.
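To make the broker role concrete, here is a toy sketch of a guardian that decides incoming data requests strictly against the user’s stated preferences and logs every request for later review; the class, rules, and requesters are hypothetical illustrations, not the committee’s design.

```python
from dataclasses import dataclass, field
from typing import List, Set, Tuple

@dataclass
class DataRequest:
    requester: str
    category: str
    purpose: str

@dataclass
class AlgorithmicGuardian:
    """Brokers data requests strictly within the user's stated preferences."""
    allowed: Set[Tuple[str, str]] = field(default_factory=set)   # (category, purpose)
    log: List[Tuple[DataRequest, bool]] = field(default_factory=list)

    def permit(self, category: str, purpose: str) -> None:
        self.allowed.add((category, purpose))

    def decide(self, request: DataRequest) -> bool:
        # Broker role: grant only requests inside the user's preferences.
        # Educator role: record every request so the user can review it.
        granted = (request.category, request.purpose) in self.allowed
        self.log.append((request, granted))
        return granted

guardian = AlgorithmicGuardian()
guardian.permit("fitness_data", "personal_health_insights")

print(guardian.decide(DataRequest("health_app", "fitness_data", "personal_health_insights")))  # True
print(guardian.decide(DataRequest("ad_network", "fitness_data", "targeted_ads")))              # False
print(len(guardian.log))  # 2: every request is recorded for the user
```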

The clear emphasis of the IEEE Global Initiative’s Personal Data and Individual Access Control Committee was on standardization, clarity and focus in collection and use, tailored consent, and broad personal access. These principles will guide further discussion, the identification of new issues, and standards development as the IEEE Global Initiative continues its work to empower the individual in the face of AI/AS advancement and, perhaps, reduce the public’s anxiety.
