Critics of Apple’s policy of keeping its review criteria for third-party applications close to the vest were assuaged last week when the company revealed those guidelines. The App Store Review Guidelines include provisions on trademarks, data aggregation, user interfaces, violence, pornography and privacy. According to the document, Apple published the rules to help developers “steer clear of issues as they develop apps” and to ensure users have a quality experience with the company’s products.
Apple’s platform and Google’s answer to it, Android, allow millions of developers to bring their apps, or end-user software applications, to market. But the range of personally identifiable data these apps may collect is vast, some privacy advocates say, which raises significant questions about how best to ensure that data is managed appropriately.
The Pew Internet and American Life Project this week released a survey finding that, among the 84 percent of American adults who have a cell phone, 43 percent have apps.
“An apps culture is clearly emerging among some cell phone users, particularly men and young adults,” said Kristen Purcell of Pew’s Internet Project.
“Apps are becoming a primary way that users get what they need,” says Jules Polonetsky, CIPP, of the Future of Privacy Forum. “There are many issues that are going to develop that are going to call for continued, responsible development.”
But as apps grow in number and usage, it remains to be seen how that growth will affect consumer privacy in matters of data collection, use and retention.
Apple’s privacy guidelines state that apps must obtain user consent before transmitting any data about the user and must inform users about how their data will be used. Apps that require personal information such as an e-mail address or birth date in order to function will be rejected, the guidelines say, as will apps that target minors for data collection.
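In engineering terms, “consent before transmission” amounts to gating every outbound request on an affirmative choice the user has already recorded. The short Swift sketch below illustrates one way an app might enforce that; the ConsentGate type, its method names and the stored preference key are hypothetical illustrations (and Swift itself postdates these guidelines), not anything Apple’s rules prescribe.

    import Foundation

    // Hypothetical consent gate: user data leaves the device only if the
    // user has explicitly opted in. All names here are illustrative.
    final class ConsentGate {
        private let consentKey = "analyticsConsentGranted"

        // Record the user's choice after an explicit in-app prompt.
        func recordConsent(_ granted: Bool) {
            UserDefaults.standard.set(granted, forKey: consentKey)
        }

        // Transmit user data only when consent is on record.
        func sendIfPermitted(_ payload: [String: String], to url: URL) {
            guard UserDefaults.standard.bool(forKey: consentKey) else {
                return // no consent recorded; the payload is never sent
            }
            var request = URLRequest(url: url)
            request.httpMethod = "POST"
            request.httpBody = try? JSONSerialization.data(withJSONObject: payload)
            URLSession.shared.dataTask(with: request).resume()
        }
    }

The design choice worth noting is that the default is silence: absent a recorded “yes,” the guard clause drops the payload, which is the posture a consent requirement implies.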
Polonetsky says the way app developers handle the customer data collected through their software is likely to become one of the leading privacy issues of the next couple of years. That’s because the millions of app developers around the world, he says, range from teenagers in basements to billion-dollar companies, all with varying levels of experience and knowledge about data protection, even though their apps may accumulate rich caches of customer data.
But if platforms publish guidelines, including privacy policies, as Apple has done, will that be sufficient to protect consumer privacy?
“It’s certainly critical for the Apple store to provide education, but it’s more important for industry to accelerate its efforts to work on getting information to the app developers, so they can, on their own, understand their legal and policy requirements,” Polonetsky said. “I think what’s important is that it’s really a far bigger picture than Apple.”
The bigger picture, Polonetsky says, one that industry and developers alike must pay attention to, is that smartphones can perform nearly limitless tasks for countless purposes, including data-intensive ones such as billing and health services. Such apps already exist on the market, including pregnancy trackers, expense trackers and stress-test apps.
An SMobile Systems report focused on Android, Google’s answer to Apple’s iOS, found that 68 percent of applications available for download in the Android Market collected metadata. In addition, 383 applications were found to have the ability to read or use authentication credentials from another service or application.
“The fact remains that there is no means available for a user to know for sure that the app they just downloaded is doing only what the user sees it doing,” the SMobile report says. “An attacker would not need to spend months…trying to find undiscovered vulnerabilities when all they would need to do is write a simple application that they are certain most consumers would willingly install on their own, in order to obtain the…data.”
That said, where liability lies is up for debate.
“Unfortunately what we’ve got is different levels of finger pointing, where the app providers say the platform provider or consumer should be the ultimate bearer of risk, and some platform providers say the app provider should take it all on,” says Chris Kelly, former chief privacy officer at Facebook.
Polonetsky says industry must launch educational efforts for developers and provide guidelines on how to mitigate risks.
“The solution needs to be a balance of platforms that set a baseline and legal or best practices that cover a range of potential activity,” Polonetsky said. “The app platforms and carriers can play a leading role, but the app community, which is yet very immature, needs to engage and step up to get ahead of potential concerns.”
Kelly agrees that app platforms like Apple’s and Google’s aren’t the only parties that must be vigilant about consumer rights. The onus, he says, is on every link in the food chain.
“From the consumer, who has the duty to respond to only reasonable offers and apps, to the platform having standards about what apps are allowed, to the app providers themselves, who should be the primarily responsible party.”
To that end, the Future of Privacy Forum is developing an information portal for app developers who might otherwise have no access to data protection guidance, and the Center for Democracy and Technology is working on a set of best practices for apps.
Jonathan Zittrain, author of The Future of the Internet and How to Stop It and a law and computer science professor at Harvard Law School, says he’d like to see different platforms try different standards in an effort to find best practices.
“The ones that hit the sweet spot could transcend a single vendor and become a de facto Web or Internet-wide standard,” Zittrain said.
Regarding Apple’s guidelines, Kelly says standards are helpful, particularly when they’re published, and that most platforms do operate according to standards. He says regulatory authorities have a reasonable role to play in assessing platforms and their commitment to consumer protection, and that in extreme cases of consumer deception there may be room for legal intervention. But most important are the standards themselves.
“Clarity in the standards that are applied should be the requirement,” Kelly says.