In the shadow of Russian intelligence officers surreptitiously using digital advertising tools in the 2016 election, both regulators and online platforms acknowledge that technology creates new challenges to the integrity of elections. Existing election laws — which generally prohibit the use of foreign money in political advertising and require advertising sponsor disclosures — often only regulate broadcast communications. Existing election protection focuses on the integrity of paper ballots and physical machines and not the information about candidates, campaigns, or elections. As Silicon Valley giants race to self-regulate, they encounter novel privacy and security challenges.
Google, Facebook and Twitter have all recently announced the steps that they will take to protect integrity and promote transparency in election-related activity on their platforms. These policies include steps to verify the identity of individuals and entities that purchase political advertising as well as required in-ad disclosures regarding the sponsors. Facebook and Google have also taken steps to create online repositories of advertisements with political content.
Both online platforms and legislators across the country are grappling with, and not finding easy answers to, a host of questions:
- Which security tools should be used to verify identity and maintain the integrity of accounts on a platform?
- How does a platform identify a political advertisement for the purpose of imposing controls?
- How can online platforms protect privacy while providing transparency regarding purchasers of political ads?
Applying controls to political advertisements
Online platforms face a new political advertising challenge: Elections, and election law, are local, whereas online platforms are global. In the United States, there are 536 holders of federal office, more than 7,000 state legislative districts, and almost 20,000 municipal governments (not to mention one elected dogcatcher position). For each of these elections, an online platform must recognize the candidates, the key issues, and the election date in order to accurately identify political ads. Traditional media — such as television stations, radio stations, and newspapers — distribute regionally and have not faced these issues.
Online platforms seeking to implement “one-size fits all” policies face rapidly changing rules. Legislation has been introduced at both the federal and state level to address several gaps in existing law, specifically:
- Expanding regulations to cover digital content.
- Addressing “issue-based” advertising.
- Considering whether the purchaser or distributor of political advertising should bear the compliance burden.
The first such law, Washington’s HB2988 Act, targets the use of “dark money” and expands reporting requirements. A new law in Maryland targets online advertising directly, and new election advertising legislation has been introduced in California, Connecticut and New York, among other states. These laws expand definitions of election advertising and impose affirmative disclosure requirements on entities accepting advertisements to create and maintain searchable online databases about who paid for ads. The Federal Election Commission has also started a rulemaking exercise on a proposal to extend existing rules to “online political advertisements.”
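The "searchable online databases" these laws contemplate can be imagined as a simple queryable archive of disclosure records. The sketch below is purely illustrative — the field names are hypothetical, and each statute specifies its own required data points:

```python
from dataclasses import dataclass

# Hypothetical disclosure record; field names are illustrative only --
# the actual required data points vary by state statute.
@dataclass
class AdDisclosure:
    sponsor_name: str          # who paid for the ad
    amount_usd: float          # amount paid
    candidate_or_measure: str  # the election the ad relates to
    run_date: str              # when the ad ran (ISO date)

def search_by_sponsor(archive, query):
    """Return all records matching a sponsor name, making the archive 'searchable'."""
    return [r for r in archive if query.lower() in r.sponsor_name.lower()]

archive = [
    AdDisclosure("Citizens for Example", 5000.0, "Ballot Question 2", "2018-10-01"),
    AdDisclosure("Example PAC", 1200.0, "State Senate District 7", "2018-10-03"),
]
```

Even this toy version shows the compliance burden shifting to the entity accepting the ad: the platform, not the purchaser, must capture and publish these records.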
While waiting for new laws, online platforms may face liability under existing law. In June 2018, the Washington State Attorney General sued Facebook and Google for violating state election law — Washington law RCW 42.17A — since 2013 by failing to keep required records regarding political advertising and failing to make these records available for public inspection. The law contemplates traditional platforms such as television and radio stations, but its definitions are written broadly and may encompass online platforms as well.
While the exact contours of the regulatory landscape remain undefined, it is clear that online platforms will have some legal responsibility to ensure adequate disclaimers and keep records in a publicly available format. However, this is not so easy to do, given the lack of a uniform definition as to what constitutes “political advertising.” Most existing laws, including FEC regulations, tie the advertising to a “clearly identified” individual who is a candidate. The new Maryland regulation defines political advertising as “relat[ing] to a candidate, a prospective candidate, or the approval or rejection of a question or prospective question, and is published, distributed, or disseminated.” Under this standard, it is possible that an ad about an issue that is being hotly debated in the election but is not actually a referendum could be covered.
The standard is so difficult to put into practice that Google took the step of suspending acceptance of political advertising for state and local elections in Maryland, noting that its systems are not set up to comply with the new law.
Of the three online platforms that have introduced new political advertising policies, only Facebook has attempted to put controls around elections other than federal elections. Facebook’s definition of political advertising, broader even than Maryland’s, includes advertisements “made by, on behalf of or about a current or former candidate for public office, a political party, a political action committee,” that “advocate[] for the outcome of an election to public office,” that “[r]elate[] to any election, referendum or ballot initiative, including ‘get out the vote’ or election information campaigns,” that “[r]elate[] to any national legislative issue of public importance,” or that are “regulated as political advertising.” Google’s policy applies only to federal candidates, though Google’s general counsel, Kent Walker, recently acknowledged that Google may expand its “coverage to a wider range of elections.” Twitter similarly defines political advertising in the U.S. as advertising, purchased by an entity or candidate registered with the FEC, that advocates for or against a candidate running in a federal election.
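The divergence among these definitions can be made concrete with a toy rule-based screen. The policy scopes below are simplified paraphrases of the definitions described above — illustrative rule sets, not the platforms’ actual criteria or implementations:

```python
# Simplified policy scopes paraphrasing the definitions above -- these are
# illustrative rule sets, not the platforms' actual screening logic.
FEDERAL_ONLY_SCOPE = {"federal_candidate"}
BROAD_SCOPE = {"federal_candidate", "state_candidate", "ballot_measure",
               "national_issue", "get_out_the_vote"}

def needs_political_controls(ad_tags, policy_scope):
    """Flag an ad if any of its content tags fall within the policy's scope."""
    return bool(set(ad_tags) & policy_scope)
```

Under this sketch, an ad tagged only `state_candidate` is flagged by the broad scope but not the federal-only one — mirroring the gap between a Facebook-style policy and a Google- or Twitter-style policy.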
The Digital Advertising Alliance (a leading industry association) has created a “PoliticalAd” icon that should link to: the name of the political advertiser; reliable contact information for the advertiser; other information required by applicable federal or state law for such notices; and the name(s) of the advertiser’s CEO, executive committee, board of directors, or treasurer.
Privacy
Increased transparency is a key aim of both election laws and the new online platform policies. Transparency, however, may have a privacy cost. Facebook and Google have not yet announced the format that the information in their online political advertising archives will take. However, the online format, together with the broad array of advertisements that may be covered, could have the practical effect of increasing the sharing of sponsor information. Moreover, as Facebook and Google think about protecting elections beyond U.S. borders, global data privacy laws may block transparency.
Efforts to screen user accounts and sponsored content for compliance may collide with privacy concerns about how online platforms utilize user data. Indeed, efforts by regulators to restrict data use in the wake of recent scandals may fail to realize that data sharing can be valuable to the election protection effort. Facebook recently launched an initiative to partner with scholars to study the impact of social media on elections, stating that: “Facebook and our funding partners recognize the threat presented by the recent misuse of Facebook data [but] we believe strongly that the public interest is best served when independent researchers have access to information.”
Security
The primary focus of the online platforms has been on ensuring that individuals who purchase election ads are who they say they are. The new verification processes all require a copy of a government-issued ID (for organizations, verification must be completed by the individual who controls the ad-purchasing account). While requiring ID is a sensible way to confirm identity, it means that highly sensitive data now travels to and is stored in the companies’ systems, which in turn raises new concerns about how the platforms are secured and how the data might be used or shared.
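One way to limit that exposure — sketched here as an assumption, not any platform’s actual practice — is to retain only a salted hash of the ID number, enough to recognize a repeat submission without storing the raw identifier:

```python
import hashlib
import os

def fingerprint_id(id_number: str, salt: bytes) -> str:
    """Salted SHA-256 fingerprint of a government ID number.

    Storing this fingerprint (plus its salt) instead of the raw number lets a
    platform detect duplicate verification attempts without keeping the
    sensitive value itself at rest.
    """
    return hashlib.sha256(salt + id_number.encode("utf-8")).hexdigest()

salt = os.urandom(16)  # per-record random salt, stored alongside the hash
```

The same ID with the same salt always yields the same fingerprint, while different IDs (or different salts) yield different ones — so matching works, but the archive leaks far less if breached.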
Google appears to be the only large internet advertising presence that explicitly links election advertising reforms with data security. Google has introduced a Protect Your Information with Google initiative, which is made up of three primary parts. The first is the Advanced Protection Program, which aims to protect the integrity of the Google accounts of targeted individuals and teams by requiring a physical Security Key, in addition to a password, to gain access. The program also aims to protect users’ privacy by limiting access to the data generated by their accounts. If successful, this would enhance the privacy controls on these accounts and also improve their data security. The second part of the initiative is Project Shield, which protects against distributed denial-of-service (DDoS) attacks. The third part is Perspective API, which aims to reduce harassment on online commenting platforms.
The federal government has also issued further guidance. On July 20, U.S. Deputy Attorney General Rod Rosenstein announced the release of the Cyber-Digital Task Force Report. The report addresses a wide range of cyber threats, including election interference. Unfortunately, it does not provide much guidance on election advertising laws, other than references to the Foreign Agents Registration Act and Federal Communications Commission regulations. However, it does describe how the Department of Justice will cooperate with the Department of Homeland Security in its election security mission and with social media providers’ voluntary efforts. Moreover, the report describes how the DOJ will investigate and disrupt state-sponsored hacking schemes related to U.S. elections. While the framework is still evolving, the report outlines an emerging federal security and enforcement regime for election data security and election advertising laws.
Between self-regulation, new legislation, and the extension of existing rules, online platforms can no longer rely on individuals to self-comply when placing ads. The likely direction is a “lowest common denominator” strategy, in which online platforms adopt policies that satisfy the broadest election laws. But even where platforms can develop an implementable definition of “political advertising,” they will have to continue to develop security and content controls to ensure adequate protection of election activity on their platforms.