DeepSeek's rapid rise in popularity over the weekend caught the eye of Big Tech investors and digital privacy groups alike. It also caught the attention of Italy's data protection authority, the Garante, which formally inquired into how the technology startup handles the personal data of Italian citizens when they use its generative AI app.

The Chinese AI startup caused a shakeup in the stock market the week of 26 Jan. after it released its newest model, R1, and claimed it performed comparably to OpenAI's latest model despite being trained for far less money and with less powerful computer chips. The app, which, like all of DeepSeek's models, is free to download and use, rose to the top of Apple's and Google's app stores.

This spike in popularity was the main impetus for consumer advocacy group Euroconsumers and its Italian member group Altroconsumo to request that Italy's data protection authority examine the app's data and privacy protection policies, according to a spokesperson. A Belgian affiliate group, Testachats, has also asked its local authority to investigate.

"The privacy policy on the official website of the joint controllers, in the opinion of the undersigned organizations, reveals multiple violations of European and national data protection regulations," Euroconsumers and Altroconsumers wrote to the Garante.

The groups argue DeepSeek's privacy policy violates several provisions of the EU General Data Protection Regulation. Their complaint specifically cites the legal basis for processing data, incomplete information about matters such as retention periods and the categories of personal data collected, profiling, how data subjects can exercise their rights to deletion and access, and how minors' data is handled.

The policy itself states user data is stored in China, but notes, "Where we transfer any personal information out of the country where you live, including for one or more of the purposes as set out in this Policy, we will do so in accordance with the requirements of applicable data protection laws." The policy also claims DeepSeek does not collect any data from minors without consent from a guardian.

The situation echoes the scrutiny placed on the popular video platform TikTok, which has faced investigations in the U.S., including a law banning its use there unless the American side of its business is sold to a U.S. buyer, and in Europe over whether the data it collects from users is stored on servers located in China.

The fallout from DeepSeek's rise has been swift. The Garante gave DeepSeek 20 days to respond to its request for information about its data practices. The White House is also looking into DeepSeek over national security implications, Axios reports.

TechCrunch reported the app has disappeared from Apple's and Google's app stores. DeepSeek's app also sustained a cyberattack and temporarily limited registrations after its spike in popularity on 27 Jan., according to Reuters.

Companies are taking notice, too. OpenAI is probing whether DeepSeek used a technique called distillation, in which an AI model is queried repeatedly so its outputs can be used to train another model, to train its own product, The Wall Street Journal reports. Similarly, Meta is trying to reverse engineer DeepSeek's latest AI model to understand how its own open-source AI technology was used, The Information reports.
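For readers less familiar with the technique, the sketch below shows the general shape of distillation as described above: a set of prompts is sent to a "teacher" model and its responses are saved as training examples for a "student" model. It is a minimal illustration only; the query_teacher function, the prompt list and the output file name are placeholders for this sketch, not any real DeepSeek or OpenAI interface.

```python
import json

# Stand-in for a call to a "teacher" model's API (hypothetical; not a real
# DeepSeek or OpenAI client). In practice this would be a network request.
def query_teacher(prompt: str) -> str:
    return f"Teacher answer to: {prompt}"

# Illustrative prompts; a real distillation pipeline would use a large,
# diverse prompt set.
prompts = [
    "Explain what a legal basis for processing personal data means.",
    "Summarize how data retention periods should be disclosed to users.",
]

# Collect prompt/response pairs. This is the "extraction" step described in
# the reporting: the teacher's outputs become the student's training labels.
with open("distillation_data.jsonl", "w", encoding="utf-8") as f:
    for prompt in prompts:
        record = {"prompt": prompt, "completion": query_teacher(prompt)}
        f.write(json.dumps(record) + "\n")

# A student model would then be fine-tuned on distillation_data.jsonl,
# learning to approximate the teacher's behavior without access to its weights.
```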

Cliff Steinhauer, CIPP/US, the director of information security and engagement at the National Cybersecurity Alliance, said DeepSeek's rise shows the need for greater AI data privacy and copyright protections across regulatory environments.

"Chinese AI companies operate under distinct requirements that give their government broad access to user data and intellectual property. This creates unique challenges when considering the use of these AI systems by international users, particularly for processing sensitive or proprietary information," he said. "The technology sector needs frameworks that ensure all AI systems protect user privacy and intellectual property rights according to international standards, while recognizing the different data access and governance requirements that exist across jurisdictions."

Caitlin Andrews is a staff writer for the IAPP.