More than 160 countries have omnibus privacy laws, yet business leaders recognize that privacy is more than a compliance exercise; it is a business imperative inextricably tied to customer trust. Ninety-four percent of organizations said their customers would not buy from them if they did not adequately protect their data. And customers are looking for demonstrable evidence: 98% said external privacy certifications, such as ISO 27701 and the APEC Cross-Border Privacy Rules, are important in their buying decisions.

These are some of the findings in the Cisco 2024 Data Privacy Benchmark Study, released 25 Jan., which draws on more than 2,600 anonymous responses from security and privacy professionals in 12 geographies. 

Strong support for privacy laws

Privacy laws put additional costs and requirements on organizations, including the need to catalog and classify data, implement controls and respond to user requests. Despite these requirements, organizations continue to overwhelmingly support privacy laws. Eighty percent of respondents said privacy laws have had a positive impact on them, versus only 6% who said the impact has been negative.   

Attractive economics

Privacy has continued to provide attractive financial returns for organizations around the world. In this year's survey, 95% indicated privacy's benefits exceed its costs. Privacy budgets remained roughly flat for 2023, averaging USD2.7 million, yet the average return on privacy investment was 1.6 times, meaning the average organization gets USD160 of benefit for each USD100 of privacy investment. Thirty percent of organizations are getting returns of at least two times their privacy investment.

Slow progress on AI and transparency

Consumers are concerned about how artificial intelligence uses their data today, and 60% have already lost trust in organizations over their AI practices. Ninety-one percent of respondents said their own organization needed to do more to reassure customers their data was being used only for intended and legitimate purposes in AI, essentially unchanged from 92% a year earlier, indicating little progress has been made.

When asked what they are doing to build confidence in their AI use, organizations cited several initiatives. Fifty percent are ensuring a human is involved in the process, 50% are trying to be more transparent about AI applications and 49% have instituted an AI ethics management program.

Concerns with generative AI

Generative AI applications use machine learning to quickly create new content, including text, images and code. Seventy-nine percent of respondents said they are already getting significant value from generative AI. But this technology brings risks and challenges, including protecting the personal or confidential data entered into these tools.

Over two-thirds of respondents indicated they were concerned about the risk of that data being shared with competitors or with the public. Nonetheless, many have entered information that could be problematic, including the 48% who said they have entered nonpublic information about their company. Fortunately, many organizations are starting to put controls in place on which tools may be used and what data may be entered.

Recommendations

The findings point to specific recommendations for organizations' privacy programs:

  • Provide greater transparency in how your organization applies, manages and uses personal data, as this will go a long way toward building and maintaining customer trust.
  • When using AI for automated decision-making involving customer data, establish protections such as AI ethics management programs, human involvement in the process and efforts to remove bias from the algorithms.
  • Apply appropriate control mechanisms and educate employees regarding the risks associated with generative AI applications.
  • Continue to invest in privacy to realize the significant business and economic benefits for your organization.