Like parents shocked by televised exposés of their daughters’ partying, regulators in 2012 made one disappointing discovery after another about mobile app privacy practices. Industry-wide, whether they are fun games, serious tools or educational resources, mobile apps continue to access, collect and use private data stored on smart devices while customers remain largely ignorant of, and disempowered by, these practices. Key reports issued this winter, coupled with recent enforcement actions, suggest that regulators are ready to insist that they and consumers no longer be subjected to these unpleasant revelations. The regulators’ plea in 2013 is: “No more surprises…or you’re grounded!”


Consumers Embrace Mobile Devices and Apps . . .


Nearly 90 percent of Americans own a wireless phone, and nearly half own a smartphone. Concomitantly, the number of devices that support mobile apps is increasing rapidly. The expansive popularity of anytime, anywhere consumer apps is producing an exponential increase in the collection and sharing of device users’ personal data. Roughly one and a half million apps are available today, and about 1,600 new apps are published every day. Many mobile apps can capture and share text messages, voice mail, voice memos, call logs, geotagged photos, videos and music. This information, when combined with unique mobile device IDs and a user’s precise geolocation history, can be assembled into rich, minute-by-minute profiles of users’ online and offline lives.


. . . But Do Not Want to be Profiled or Monitored


Notwithstanding these grand data collection capabilities, recent studies suggest that consumers do not expect or wish to be monitored and profiled by their smart devices or by the apps loaded on them. A September 2012 Pew Research Center study reveals that 54 percent of app users decided not to install an app after learning how much personal information it would collect, and 30 percent of app users uninstalled an app after learning that it was collecting more personal information than they wished to share. A report by the Berkeley Center for Law & Technology found a yawning gap between industry data collection and sharing practices and consumer privacy expectations in mobile media. In fact, 78 percent of Americans consider information on their mobile devices to be at least as private as data stored on their home computers. A whopping 81 percent of respondents would definitely or probably not want to share social media contact lists stored on their devices in order to obtain more connections on a social media service, and 93 percent would not choose to share their personal contacts so that a coupons app could send coupons to those people. Another Berkeley Center report finds that a majority of respondents believe the slogan “Do Not Track” effectively means “do not collect information that allows companies to track them across the Internet.” The study did not target mobile apps, but the consumer sentiment is entirely relevant to data collection by mobile apps.

Industry Initiatives in Mobile Privacy Guidelines


The contrast between consumer expectations and the various practices frowned upon by regulators is not due to a lack of available industry guidance. For starters, the CTIA (The Wireless Association) has for some time offered guidelines for notice and consent in the deployment of location-based mobile services. In February 2012, the GSM Association, a trade group representing the mobile industry, released “Privacy Guidelines for Mobile Application Development,” which addresses the bulk of the notice, consent, transparency and control issues that trouble regulators. The Mobile Marketing Association has also published a framework for the development of privacy disclosure statements in mobile apps. More recently, ACT, an organization representing small and mid-sized app developers and other technology firms, proposed a set of icons for disclosing privacy features in mobile apps. And since late 2011, a best practices guideline has been available that sets out recommendations quite close in substance to the checklist recently propounded by the California AG. TRUSTe, a leading provider of website privacy certification seals, offers a fast-track program to help mobile apps comply with regulatory expectations. Finally, the speediest offering we have seen is the PrivacyChoice service, which advertises a “state-of-the-art privacy statement for your app or site in about 10 minutes” through the use of an online wizard.

Mind the Gap: Regulators Focus on the Disconnect Between Mobile App Practices and Consumer Expectations

Despite the numerous industry initiatives, industry enthusiasm for customer data has plainly outstripped the industry and regulatory measures meant to promote transparent privacy practices and clear choices for consumers. This widening gap is the subject of several FTC reports and statements by the California AG.

The Federal Trade Commission (FTC) has homed in on mobile apps for children, publishing its findings in two reports issued during 2012: “Mobile Apps for Kids: Current Privacy Disclosures are Disappointing” and “Mobile Apps for Kids: Disclosures Still Not Making the Grade.” The FTC has also hosted public workshops addressing advertising and payments issues in the “mobile ecosystem” and released a major whitepaper on consumer privacy. It has even published high-level guidance for marketing mobile apps consistent with basic privacy principles previously stressed by the commission.

On the other side of the nation, the California Attorney General joins the regulatory triangulation on the mobile apps industry with “Privacy on the Go: Recommendations for the Mobile Ecosystem.” This is the most comprehensive privacy guidance to date for mobile media issued by any state. While not law, the recommendations lay down a baseline compliance checklist for all mobile businesses subject to California’s Online Privacy Protection Act (CalOPPA). They will be an important reference point in ongoing industry and government efforts to establish national mobile privacy standards.

All of this guidance is a response to the agencies’ findings that the actual practices of mobile apps are quite concerning from the perspective of consumer privacy. In the second “Mobile Apps for Kids” report, FTC staff reviewed a random selection of 400 kids’ apps from the Apple App Store and Google Play and exposed some stark patterns:

  • Only 20 percent of reviewed apps contained any privacy-related disclosure on the app’s promotion page, developer’s website or within the app itself.

  • Privacy disclosures that did exist were often long and technical, lacked useful details, were padded with irrelevant ones, or were ambiguous.

  • 56 percent of reviewed apps transmitted the user’s device ID to ad networks, analytics companies or other third parties. (The report highlights that unique device IDs facilitate the aggregation of other types of personal information related to the same device, not just from the downloaded app, but from other sources, including other apps, thereby facilitating data-rich profiling of individuals.)

  • Some third parties, such as ad networks and analytics companies, receive data from many apps, giving them the potential to build detailed profiles by aggregating data linked to unique device IDs.

  • Although only nine percent of reviewed apps disclosed the presence of in-app advertising, 58 percent actually contained ads.

  • Fewer than half of the apps with social media links actually disclosed the presence of such links.


The report concludes that the revealed practices may constitute violations of the Children’s Online Privacy Protection Act (COPPA) or the FTC Act’s prohibitions against unfair and deceptive trade practices. While restrained in tone, the report resonates with disappointment at children’s mobile app developers.


As it studies the industry, the FTC has also been actively pursuing specific investigations and enforcement. In September 2011, the FTC entered into its first consent decree with a mobile app provider, W3 Innovations, which sold popular kids’ apps without complying with any of COPPA’s requirements. That same month, the FTC signed consent agreements with the makers of AcneApp based on their unfounded health claims that colored lights emitted from smartphones could heal acne. In October 2011, the FTC settled a case with a peer-to-peer file-sharing app developer, Frostwire LLC, whose apps misled consumers about the extent to which their personal files would be shared, exposing those files to a network of millions of users with barely any notice.


Key Recommendations and Enforcement Steps by the California AG’s Office

Across the nation, the efforts of the California AG’s office parallel the FTC’s move from study and guidance towards enforcement. In early 2012, the office published a “Joint Statement of Principles” with major mobile application platform providers: Amazon, Apple, Google, Hewlett-Packard, Microsoft and Research in Motion. The Joint Statement was a shot across the bow to large players in the mobile apps industry, reminding them that CalOPPA, which requires operators of commercial websites to post compliant privacy disclosures, applies equally to mobile e-commerce. In June 2012, Facebook joined as a co-signatory to the Joint Statement. Facebook also participated in a multi-stakeholder advisory group on mobile privacy practices led by the AG’s office and the California Office of Privacy Protection.

In October 2012, the office furthered its stance on privacy by sending notification letters to approximately 100 mobile app makers informing them that they were not in compliance with CalOPPA and giving them 30 days to respond or comply. Despite acknowledging receipt of the letter and an intent to comply, Delta Air Lines did not correct its Fly Delta app’s privacy deficiencies. As of December, the mobile app still did not include an app-specific privacy policy addressing the various forms of PII that it collected, including credit card and geolocation information. The office promptly filed a complaint against Delta, seeking penalties of $2,500 per violation, i.e., per download of the app.

On a broader scale, the office takes a rosier view than the FTC. The Joint Statement industry signatories state that they have implemented the joint principles. From September 2011 to June 2012, the share of free apps in the Apple App Store with a privacy policy more than doubled, from 40 percent to 84 percent, and the share in Google Play rose from 70 percent to 76 percent. Now, Attorney General Kamala Harris has encouraged the industry further by publishing detailed privacy recommendations for all players in the “mobile ecosystem”: app developers, app platform providers, online advertising networks, operating system developers and carriers.

The thrust of the recommendations is to encourage the mobile industry to build fair information practice principles (FIPPs) into the design of mobile apps, devices and services, and to enhance transparency in privacy disclosures and simplicity in user privacy controls. Their overarching goal is “surprise minimization”: reducing instances where consumers are subject to unexpected or undisclosed collection and sharing of their personal information, particularly information that is not required for an app’s basic functions. For instance, a user might be unpleasantly surprised to learn that a birding app also harvests the user’s personal contacts or call logs. The notion of limiting mobile data collection to uses “reasonably expected” by users in the context of an app’s purported functions is central to the recommendations.
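To see what limiting collection to reasonably expected uses might look like in practice, a developer can compare the permissions an app actually requests against the set its core features genuinely need. The Kotlin sketch below is a minimal illustration, assuming an Android app; the birding-app allow-list and the function name are hypothetical, and a real team would run this kind of audit in code review or automated builds rather than relying on a runtime log.

```kotlin
import android.content.Context
import android.content.pm.PackageManager
import android.util.Log

// Hypothetical allow-list: the only permissions this (imaginary) birding app
// needs for its core bird-spotting features.
private val EXPECTED_PERMISSIONS = setOf(
    "android.permission.CAMERA",
    "android.permission.ACCESS_FINE_LOCATION"
)

/**
 * Logs any permission the app requests beyond what its core features require,
 * e.g. READ_CONTACTS pulled in unnoticed by a bundled ad or analytics SDK.
 */
fun auditRequestedPermissions(context: Context) {
    val info = context.packageManager.getPackageInfo(
        context.packageName,
        PackageManager.GET_PERMISSIONS
    )
    val requested = info.requestedPermissions?.toSet() ?: emptySet()
    val unexpected = requested - EXPECTED_PERMISSIONS
    if (unexpected.isNotEmpty()) {
        // A real project would surface this in review or CI, not just log it.
        Log.w("PrivacyAudit", "Permissions beyond core functionality: $unexpected")
    }
}
```

Bundled third-party advertising and analytics libraries are a common source of the unexpected permissions such an audit surfaces, which is precisely the kind of mismatch the recommendations ask developers to catch before users do.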

The AG specifically recommends that mobile app developers undertake the following:


  • Identify, identify, identify. Developers must be clear on what their app is doing. What are all the types of personal data that the app will collect or disclose to third parties, including through third-party software used in the app? This includes, but is not limited to, unique device identifiers, geolocation data, mobile phone numbers, text messages, call logs, financial payment information, health and medical information, photos and videos, and browsing and download histories.



  • Check it off. The recommendations and other sources provide checklists designed to smoke out data uses that may conflict with FIPPs or applicable privacy laws. A developer needs to be able to answer checklist questions for each data type or category, creating a thorough matrix of the app’s data practices. For example:

    • Is the data necessary for the app’s basic functionality or for related business purposes such as billing?

    • With whom will the data be shared?

    • For what purpose will the data be shared?

    • How will third parties use the shared data?

    • What kind of access does the developer have to an app purchaser’s device?

    • Can the device owner modify these permissions?





  • Make some decisions. After working through this matrix of data needs, wants and excesses, app developers should establish their privacy practices. The recommendations cite several of the familiar FIPPs, such as transparency of data practices; limited data collection and retention; easily read, easily accessible privacy policies; means for consumers to access personal data collected and retained by the app; reasonable security measures, including encryption of personal data in transit; and designation of responsible persons in the organization to maintain and update privacy policies.



  • Think small. Given the screen constraints of mobile devices, the recommendations encourage developers to consider notice and choice methods best suited to the app environment: icons, layered notices, grids, labels and dashboards for user-controlled privacy choices.



  • Go above and beyond. For apps that collect sensitive information, such as financial or health information, or personally identifiable information that is not related to app functionality, the recommendations suggest enhanced notification measures. “Special notices” would be short notices, delivered in real time before data is collected, informing consumers of the impending collection of their information.
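As an illustration of such a special notice, the Kotlin sketch below, which assumes an Android app built on the AndroidX activity-result API, shows a short, plain-language notice presented in real time, before any geolocation is collected and ahead of the platform’s own permission prompt. The activity and method names are hypothetical.

```kotlin
import android.Manifest
import android.app.AlertDialog
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

// Minimal sketch of a "special notice": a short, real-time explanation shown
// *before* geolocation is collected, in addition to the platform permission prompt.
class CheckInActivity : AppCompatActivity() {

    private val locationPermission =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) startLocationBasedFeature() else disableLocationBasedFeature()
        }

    // Hypothetical entry point, e.g. the user taps "Find birds near me".
    fun onLocationFeatureRequested() {
        AlertDialog.Builder(this)
            .setTitle("Location notice")
            .setMessage(
                "To list sightings near you, this app will collect your precise " +
                "location while the feature is in use. It is not shared with advertisers."
            )
            .setPositiveButton("Continue") { _, _ ->
                locationPermission.launch(Manifest.permission.ACCESS_FINE_LOCATION)
            }
            .setNegativeButton("Not now") { _, _ -> disableLocationBasedFeature() }
            .show()
    }

    private fun startLocationBasedFeature() { /* begin collection only after notice and consent */ }
    private fun disableLocationBasedFeature() { /* feature stays off; no location is collected */ }
}
```

The point is the sequencing: the explanation and the user’s affirmative choice both come before collection begins, rather than being buried in a general privacy policy.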


Other specific tips in the guidance for app makers’ privacy practices include:

  • Use app-specific or other non-persistent device identifiers when collecting data.

  • Give users control over the collection of any personal data that is not needed for the functioning of the app.

  • Make privacy-protective settings the default for user controls.
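Two of these tips lend themselves to a brief sketch. The Kotlin code below, again assuming an Android app and using hypothetical preference keys, generates an app-specific identifier that the user can reset instead of reading a persistent hardware ID, and keeps analytics collection off by default until the user opts in.

```kotlin
import android.content.Context
import java.util.UUID

// Sketch of two of the tips above, assuming an Android/Kotlin app:
// (1) an app-generated identifier instead of a persistent hardware/device ID, and
// (2) analytics collection defaulting to OFF until the user opts in.
object PrivacySettings {

    private const val PREFS = "privacy_prefs"        // hypothetical preferences file
    private const val KEY_APP_ID = "app_instance_id"
    private const val KEY_ANALYTICS = "analytics_opt_in"

    /** Returns an app-specific ID, created on first use and resettable via [resetAppId]. */
    fun appInstanceId(context: Context): String {
        val prefs = context.getSharedPreferences(PREFS, Context.MODE_PRIVATE)
        return prefs.getString(KEY_APP_ID, null) ?: UUID.randomUUID().toString().also {
            prefs.edit().putString(KEY_APP_ID, it).apply()
        }
    }

    /** Lets the user invalidate the identifier, unlike a hardware device ID. */
    fun resetAppId(context: Context) {
        context.getSharedPreferences(PREFS, Context.MODE_PRIVATE)
            .edit().remove(KEY_APP_ID).apply()
    }

    /** Privacy-protective default: analytics collection is off until explicitly enabled. */
    fun analyticsEnabled(context: Context): Boolean =
        context.getSharedPreferences(PREFS, Context.MODE_PRIVATE)
            .getBoolean(KEY_ANALYTICS, false)
}
```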


Yes, You Too: Recommendations for App Platform Providers, Ad Networks and Other Market Participants

A resounding theme of the regulators is the necessity and interdependence of all mobile ecosystem participants in effecting adequate privacy protections for consumers. The recommendations emphasize that app platforms, mobile advertising networks, mobile operating system developers and mobile carriers need to play a role in protecting consumer privacy. Platform providers can help consumers access an app’s privacy disclosures before they download the app, supply the means for users to report complaints or ask questions about purchased apps, and further consumer education on mobile privacy. Mobile ad networks should provide their own privacy policies to developers who deliver their ads so that the apps can link to that policy in a pre-purchase notification. Operating system developers and mobile carriers should work together, among other purposes, to develop cross-platform standards for privacy controls and security patches.


Conclusion: It’s Industry’s Turn

Despite all the chastising, regulators are making great efforts to support the app industry with clear and implementable guidance on consumer privacy. They do not, however, sufficiently address the business model of mobile apps, which, like the Internet, is driven by advertising rather than pay-for-play. The recommendations give a nod to this economic reality by focusing on “surprise minimization” and “special notices” rather than insisting on actual minimization of data collection and use. Although it is not technically difficult to include app-specific privacy policies and user notifications, all the participants need to give more thought to how the industry can adapt its business paradigm to incorporate fair information practices. Given the abundance of tools and principles, developers and other industry players are sufficiently equipped to begin this process and thus, hopefully, avoid the external imposition of such a paradigm shift. The guidance and goals are there; it is time for industry to proactively build privacy into its business model.