
Privacy Perspectives | Privacy: An organization’s responsibility for building trustworthy systems


An organization's handling and use of individual data will shape its long-term success as consumers and governments grow more concerned with data privacy and protection. Businesses can no longer sit back and wait to react to these changing market forces. They must recognize that building trust with consumers is essential to their success, and that respecting individual privacy is a crucial component of that trust. The purpose of this piece is not to discuss regulation, but to show how privacy enforcement and operationalization become possible when privacy is treated as an ethical concept.

Privacy as an ethical concept 

Philosophers, legal scholars, scientists, and governments have defined the concept of privacy in several ways, highlighting its pluralistic, cross-disciplinary, and context-dependent nature, much like ethics. For instance, some define privacy narrowly as control over information about oneself, while others describe it as a broader concept required for human dignity. Much like the ethical dilemmas Immanuel Kant discussed, organizations face privacy dilemmas when translating these concepts into practice. A good example of a privacy dilemma is contact tracing during COVID-19: collecting geolocation data was deemed necessary to contain the spread of the pandemic, leaving individuals little real choice.

Increased awareness among individuals and governments has led to substantial privacy policy research and, consequently, to the enactment of privacy regulations, including the EU General Data Protection Regulation, the Personal Information Protection and Electronic Documents Act, and the California Consumer Privacy Act. But unlike security, privacy lacks industry-specific technical standardization and maturity, causing complications and confusion. As a result, organizations strive for check-box compliance to avoid regulatory penalties: highlighting the "Accept All Cookies" button while greying out "Reject All Cookies," for example, confuses users and is unfavorable for privacy, yet still passes a privacy audit.

Since privacy regulations are, by nature, written somewhat vaguely, they are open to interpretation depending on context, politics and resolved legal cases. Daniel Solove categorized privacy harms by reviewing U.S. privacy cases in several circuit courts and the U.S. Supreme Court, demonstrating how inconsistently privacy regulation has been interpreted over the years. Further, recent events, such as the overturning of Roe v. Wade, a judgment originally grounded in the right to bodily privacy, weaken the overall notion of privacy. Regulations can therefore be an unreliable basis for respecting individual privacy, and an organization's dependence on their fuzzy interpretation can be more detrimental than no compliance at all.

Organizations demonstrate their commitment to safeguarding individual privacy against external attacks by utilizing privacy-enhancing technologies (PETs) such as differential privacy and homomorphic encryption. Still, PETs cover only one aspect of privacy: protection against illegitimate third-party use. They do not protect an individual's right to control how the organization itself uses, processes and distributes their information in exchange for a service.
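
To make one of these PETs concrete, the sketch below shows differential privacy via the Laplace mechanism: a third party receives only a noisy aggregate rather than individual records. It is a minimal illustration, not a production configuration; the query, the epsilon value and the data are assumptions made for the example, and it relies only on NumPy.

```python
# Minimal sketch of one PET: differential privacy via the Laplace mechanism.
# The query (a simple count), epsilon and the data are illustrative only.
import numpy as np

def dp_count(records, epsilon=0.5, sensitivity=1.0):
    """Return a differentially private count of records.

    sensitivity: how much one individual can change the true count (1 for a count).
    epsilon: privacy budget; a smaller epsilon adds more noise, giving stronger privacy.
    """
    true_count = len(records)
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Outsiders see only the noisy aggregate, never the underlying records.
opted_in_users = ["u1", "u2", "u3", "u4", "u5"]
print(round(dp_count(opted_in_users), 2))
```

Note that the mechanism protects what is released to outsiders; it says nothing about how the organization itself uses the underlying records, which is exactly the gap described above.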

When viewed as an ethical concept, privacy can help organizations approach technical system design holistically, navigate privacy dilemmas, and instill a culture of respecting individual privacy, while challenging them to develop creative solutions. The organization's motivation shifts from avoiding fines and business-favorable interpretations of regulations to building trustworthy relationships with consumers.

Privacy: An organization's responsibility 

There is a notion that privacy must be traded off against a system's utility, especially since consumers tend not to bear any direct cost of privacy (the privacy paradox), which discourages organizations from seeking alternate solutions. However, research has shown the average organization receives benefits worth 1.8 times its privacy investment, roughly USD 1.80 returned for every dollar spent on privacy.

Organizations that make privacy a business priority see higher returns than organizations that deprioritize it. Ensuring individual privacy does not happen automatically; it should be treated as a business motivation and a core responsibility in software development.

Privacy as an ethical concept encourages organizations to actively operationalize it in their culture and translate it into innovative solutions. It makes everyone in an organization, not just security, privacy or compliance personnel, accountable for protecting individuals' right to privacy. This also means giving everyone the right processes to report privacy issues, as well as building technologies that collect, process and disseminate information in keeping with individuals' choices. Privacy due diligence and due care should be reflected in technical innovation and privacy engineering to help organizations reap the benefits of a trustworthy relationship with their consumers.

Relation to trust and organizational success

Individuals today directly associate the way their data is treated with how they themselves are treated. Research has found that many individuals feel resigned to the idea that organizations do not care about their data. This feeling is worsened by the difficulty of analyzing privacy notices, which, according to Solove, do not provide information meaningful enough for informed decision-making. Over time, these feelings erode trust between consumers and the organization. Privacy, when viewed as a core business priority and a component of trust, can drive business outcomes, increase consumer loyalty, improve brand perception and lead to higher earnings.

Special case of privacy in AI systems

Artificial intelligence systems, by default, require ever more data to draw correlations, contrary to the data minimization principle of privacy. When used responsibly, and not just for an organization's benefit, AI can be extremely helpful in generating social good. But AI systems can be perceived as opaque and can increase distrust unless the organization establishes and operationalizes principles reflecting its values and its commitment to inclusion, privacy, safety and transparency. Maintaining the balance between privacy, utility and fairness is a key ingredient in building trustworthy and functional AI.

Privacy-preserving mechanisms (PPMs) protect against malicious attacks such as reverse inference and membership inference, but respecting individual privacy starts before any PPM is introduced. Identifying the machine learning objective, developing a purpose map, and weeding out unnecessary data is a good place to start. Respecting an individual's choice to participate in an ML model, and providing visibility and channels to contest, ask questions and be forgotten, can earn their trust and even make them willing to share more data. Hence, AI systems should be approached holistically, respecting the rule of law, human rights, dignity, and our shared values of ethics, inclusiveness, and privacy.
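
As a rough illustration of the purpose-map idea, the sketch below keeps only the fields mapped to a stated ML objective and only the records of individuals who have consented, before any training takes place. The objective, field names and consent flag are hypothetical, chosen purely for the example.

```python
# Hypothetical sketch: data minimization driven by a purpose map,
# applied before any model training. Objective and field names are invented.

# Which fields the stated ML objective actually needs.
PURPOSE_MAP = {
    "churn_prediction": {"tenure_months", "monthly_usage", "support_tickets"},
}

def minimize(records, objective):
    """Drop non-consenting individuals and any field with no mapped purpose."""
    allowed = PURPOSE_MAP[objective]
    return [
        {field: value for field, value in record.items() if field in allowed}
        for record in records
        if record.get("consented_to_ml", False)
    ]

raw_records = [
    {"name": "Alice", "tenure_months": 12, "monthly_usage": 40,
     "support_tickets": 2, "consented_to_ml": True},
    {"name": "Bob", "tenure_months": 3, "monthly_usage": 5,
     "support_tickets": 7, "consented_to_ml": False},  # excluded: no consent
]

print(minimize(raw_records, "churn_prediction"))
# [{'tenure_months': 12, 'monthly_usage': 40, 'support_tickets': 2}]
```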

The path forward: Operationalizing privacy 

We propose approaching privacy as an organization's commitment to respect the individual, which can help instill trust and secure long-term business benefits. Specifically, we propose:

  • Establishing an ethical code of conduct with privacy as a core value.
  • Translating that code into tenets that inform practical approaches to privacy challenges and designs that earn trust.
  • Utilizing privacy best practices throughout the program's lifecycle, from design to delivery.
  • Monitoring each phase for compliance against the tenets (a minimal sketch of such a check follows this list).
  • Designing flexible systems that accommodate deviations and feedback.
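
To show how monitoring against tenets might look in practice, here is a minimal, hypothetical sketch that encodes two tenets as checks and reviews one lifecycle phase against them. The tenets, artifact fields and phase name are illustrative assumptions, not a prescribed framework.

```python
# Hypothetical sketch: encoding program tenets as checks and reviewing one
# lifecycle phase against them. Tenets and artifact fields are illustrative.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Tenet:
    name: str
    check: Callable[[Dict], bool]  # True if the phase artifact satisfies the tenet

def review_phase(phase: str, artifact: Dict, tenets: List[Tenet]) -> List[str]:
    """Return the deviations found when reviewing this phase against the tenets."""
    return [
        f"{phase}: fails tenet '{t.name}'"
        for t in tenets
        if not t.check(artifact)
    ]

tenets = [
    Tenet("Collect only purpose-mapped fields",
          lambda a: set(a["collected_fields"]) <= set(a["purpose_mapped_fields"])),
    Tenet("Honor opt-out before processing",
          lambda a: a["opt_out_respected"]),
]

design_artifact = {
    "collected_fields": ["email", "device_id"],
    "purpose_mapped_fields": ["email"],   # device_id has no documented purpose
    "opt_out_respected": True,
}

print(review_phase("design", design_artifact, tenets))
# ["design: fails tenet 'Collect only purpose-mapped fields'"]
```

Deviations surfaced this way feed the last point above: a flexible design can absorb the finding and adjust, rather than leaving it to be discovered in a later audit.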

Depending on the program tenets developed, one could argue this might lead to a weaker notion of privacy. However, we suggest building program tenets with regulations as a baseline, insisting on the highest standards, and treating individual trust as the goal. Further, using objective, regulation-agnostic information metrics, identifying design patterns and anti-patterns, and making responsible efforts to be transparent about functionality design can help translate tenets into actionable privacy controls.

A successful privacy program requires executive ownership that showcases the organization's intention and commitment to transparency, privacy, and security when leveraging technology to build trustworthy systems. Training and educating product owners and builders, and bringing in diverse perspectives, can help identify privacy requirements and their repercussions, navigate privacy-related trade-offs and dilemmas, and produce a reliable system backed by conscience as well as regulation.

Conclusion

Our call to action is to prioritize "doing the right thing" over speed of delivery, in order to achieve a higher return on investment while being better prepared to handle privacy regulations. As this essay points out, translating privacy into technical language is a complex challenge, so organizations must start by viewing privacy as an ethical concept and an important responsibility while taking intentional steps toward creating a culture that respects individuals.

