In 2023, spending on artificial intelligence systems will reach $97.9 billion, more than 2.5 times the 2019 figure, according to IDC research. “Software is eating the world, but AI is going to eat software” is how NVIDIA CEO Jensen Huang famously put it. This ongoing AI revolution requires collecting enormous amounts of data. Across industries, from healthcare and retail to automotive and public transport, computer vision and video analytics are key AI techniques fueling new digital solutions.
At the same time, there is a growing debate around AI systems based on video data. The use of facial recognition technology during protests in the U.S. and Hong Kong, in particular, has sparked fierce public debate. Amid this tension between innovation and privacy, regulators worldwide are trying to establish clearly defined rules for the digital revolution.
A study by brighter AI in cooperation with the German AI Association summarizes the privacy regulations of six major markets: the EU, the U.S., China, Japan, South Korea and Brazil. (Editor's note: The study was published before Brazil's General Data Protection Law went into effect.) With a focus on video processing for AI and analytics, the report highlights views and best practices from industry leaders and privacy experts.
Global projects, local laws
Overall, the study found that while data collection and AI projects are increasingly borderless, privacy laws are not. For many industries, it is not only the multinational setup with geographically dispersed expert teams that makes AI projects inherently global. Take the development of autonomous vehicles: To develop safe cars for international markets, neural networks need to be trained on data collected on roads worldwide.
Currently, the EU General Data Protection Regulation is the only privacy law that covers multiple countries (27 member states). In the U.S., the picture is quite the opposite: privacy laws differ from state to state. Furthermore, some regulations carry a lot of additional complexity. In China, for example, several different data privacy guidelines exist, chiefly the Cybersecurity Law and the Personal Information Security Specification. Only a few — partially contradictory — translations exist, and responsibilities are divided among different authorities, leaving international data controllers with uncertainty.
Privacy laws on the rise
Even though these uncertainties exist in China, a country known for lightspeed innovation and public facial recognition systems, privacy laws are becoming stronger. Foreign companies that engage in video data projects, in particular, must be aware of the laws and need to anonymize personal information in their datasets.
Several countries in the Asia Pacific region are aligning more closely with the GDPR and have begun to set up more stringent data protection laws. The adequacy decision by the European Commission regarding Japan’s Act on the Protection of Personal Information and ongoing talks between the EU and South Korea are signs of increasing legal homogeneity.
Despite the Court of Justice of the European Union’s ruling against the EU-U.S. Privacy Shield, the introduction of the California Consumer Privacy Act makes it clear that some key principles, such as the right to deletion, are gradually becoming global standards. The law is all the more remarkable given that California is home to the digital powerhouse Silicon Valley and would be the world’s fifth-largest economy were it a country. With privacy bills pending in other U.S. states, such as Washington, and the goal of equal rights for all Americans, a growing number of voices are calling for countrywide privacy regulation.
In Brazil, the General Data Protection Law (LGPD) is now in effect, and its principles and hefty fines of up to 2% of revenues clearly parallel the GDPR.
Privacy by design as a competitive advantage
To comply with privacy laws in international AI projects, it is advisable to understand each market. Here, companies should not look for loopholes, but rather align their practices with the strictest applicable rules.
Besides increasing pressure to fulfill legal requirements, insights from business, tech and privacy leaders show that companies increasingly find value in compliance and in embracing privacy by design. Embedding the best possible data privacy settings and technical means, including data pseudonymization and anonymization, directly into processes and products can become a significant advantage in building sustainable AI and computer vision solutions. Business partners and regulators will acknowledge this — and, most importantly, so will your customers.
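As a minimal illustration of what such a technical measure can look like, the sketch below pseudonymizes a direct identifier with a keyed hash. The field names, records and key here are hypothetical, and a real deployment would involve key management and a broader assessment of re-identification risk; this is only one possible building block, not a compliance recipe.

```python
import hmac
import hashlib

# Hypothetical secret key; in practice this must be generated securely and
# stored separately from the data so the pseudonyms cannot be reversed
# without it (the core idea of pseudonymization under GDPR Art. 4(5)).
SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymize(value: str) -> str:
    """Return a stable pseudonym for `value` using HMAC-SHA256."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

# Example records with a direct identifier ("name") and analytic payload.
records = [
    {"name": "Alice", "visits": 3},
    {"name": "Bob", "visits": 1},
]

# Replace the identifier with its pseudonym; analytics on "visits" still work,
# and the same person always maps to the same pseudonym across datasets.
pseudonymized = [{**r, "name": pseudonymize(r["name"])} for r in records]
```

Because the mapping is keyed rather than a plain hash, an attacker without the key cannot simply recompute pseudonyms from a list of known names; full anonymization, by contrast, would remove or irreversibly transform the identifier altogether.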