New Workshop for Professionals Focused on Trust and Safety, and Content Moderation

Registration is now closed.

The IAPP is exploring ways to support professionals working in the trust and safety and content moderation fields. Professionals doing this work have much in common with privacy teams. Both are tasked with applying policies and principles in complex technological settings, while also ensuring procedural due process, transparency, accountability and individual recourse.

Scheduled alongside the IAPP Global Privacy Summit 2019, the Content Moderation in 2019 workshop in Washington will address organizational, operational and technological challenges and emerging best practices in this field. Speakers will feature leaders from online platforms, social networks, sharing economy apps and more.

When
Wednesday, May 1, 2019

Where
Washington Marriott Marquis Hotel
901 Massachusetts Ave. NW
Washington, DC 20001

Pricing
Industry: $495
Non-profit/Government/Academic: $195

Keynote Speakers

Tarleton Gillespie

Microsoft Research, Author of 'Custodians of the Internet'

Tarleton Gillespie is a principal researcher at Microsoft Research New England, and an affiliated associate professor in the Department of Communication and Department of Information Science at Cornell University. His new book, “Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions that Shape Social Media” (Yale University Press) was published in June 2018. He is also the author of “Wired Shut: Copyright and the Shape of Digital Culture” (MIT Press, 2007).

Nicole Wong

Former Deputy Chief Technology Officer of the United States; VP and Deputy General Counsel, Google

Nicole Wong specializes in assisting high-growth technology companies to develop international privacy and regulatory strategies. She previously served as deputy U.S. chief technology officer in the Obama Administration, focused on internet, privacy and innovation policy. Prior to her time in government, Wong was Google’s vice president and deputy general counsel, and Twitter’s legal director for products. She frequently speaks on issues related to law and technology, including five appearances before the U.S. Congress. Wong chairs the board of Friends of Global Voices, a non-profit organization dedicated to supporting citizen and online media projects globally. She also sits on the boards of WITNESS, an organization supporting the use of video to advance human rights, and the Mozilla Foundation, which promotes the open internet. Wong currently serves as an advisor to the AI Now Institute, the Alliance for Securing Democracy, Luminate, Refactor Capital and the Albright Stonebridge Group.

Schedule and Programme

Wednesday, May 1

  • 8:30 - 9:30 a.m. Registration and Breakfast
  • 9:30 - 10 a.m. Keynote Presentation

    Tarleton Gillespie, Microsoft Research, author of “Custodians of the Internet”

  • 10 - 11 a.m. Defining the Profession

    Kevin Collins, Managing Director, Software & Platforms Industry Practice, Accenture
    Alex Feerst, Head of Legal, Medium
    Robert Lewington, Director of Global Moderation, Twitch

    Let’s explore how the field of content moderation is evolving, its importance within organizations and what can be learned from analogous professions such as privacy. We will dig into the critical questions: Where does the content moderation function sit within an organization, and to whom does it report? How many employees are on the team and where are they located? Which functions are involved, including legal, policy, PR, engineering and beyond?

  • 11 a.m. - 12 p.m. Operationalizing the Profession: Hiring, Training and Developing Processes

    Denise Farnsworth, Deputy Data Protection Officer, Facebook, Instagram, Oculus, WhatsApp
    Becky Foley, Director of Content, TripAdvisor
    Corinne Pavlovic, Senior Director of Trust & Safety, Etsy
    Damien Vaught, Online Safety Manager, Microsoft

    What are some of the key hiring and training strategies for this work? How is the role of content moderation evolving, and how does that affect recruiting? What are some systematized approaches to content moderation? This session will delve into how companies are effectively recruiting and training content moderation and trust and safety teams, and explore possible frameworks for the adjudication of online content.

  • 12 - 1 p.m. Networking Lunch
  • 1 - 1:30 p.m. Keynote Presentation

    Nicole Wong, Former Deputy Chief Technology Officer of the United States; VP and Deputy General Counsel, Google

  • 1:30 - 2:30 p.m. Reconciling Laws, Languages and Cultures

    Daphne Keller, Director of Intermediary Liability, The Stanford Center for Internet and Society
    Simon McDougall, CIPP/E, CIPM, CIPT, Executive Director, Technology Policy and Innovation, U.K. Information Commissioner’s Office
    Nicole Karlebach, Global Head, Business & Human Rights, Verizon Media

    As tech platforms continue to grow and their user bases become more multinational, content moderation strategies must be deployed and harmonized worldwide. With the increase in the spread and scope of misinformation, particularly during democratic elections, government involvement is also on the rise. New laws and regulations are regularly being introduced, such as U.S. bills aimed at curbing online sex trafficking and an EU regulation requiring the removal of terrorist content within 24 hours. As a result, platforms must create processes that accommodate a plethora of laws, languages and cultures. This session will focus on how companies are addressing content moderation globally: what can be globalized and where national silos remain.

  • 2:30 - 2:45 p.m. Networking Refreshment Break
  • 2:45 - 3:45 p.m. Machine Learning, AI and Humans

    Jason Coon, Head of Trust and Safety, Mixer.com
    Carlos Figueiredo, Director of Trust & Safety, Two Hat Security
    Adam Neufeld, VP of Innovation and Strategy, ADL

    To perform content moderation in real time and at scale, companies must implement and train machine learning and AI tools. As these tools improve in effectiveness, how is the division of labor between humans and machines shifting? What can technology solve and what remains beyond the reach of tech innovation? Who are the vendors in the space? During this session we will discuss the balance and effectiveness of the human-machine partnership in content moderation.

  • 3:45 - 4 p.m. Wrap-Up

    Eric Goldman, Professor of Law, Co-Director of the High Tech Law Institute, and Co-Supervisor of the Privacy Law Certificate, Santa Clara University School of Law

    What are the skills and skill gaps for content moderation professionals? What are the needs of this emerging profession in terms of people, processes and tools?

The Inflection Point

For the past two decades, policymakers, scholars and business leaders have struggled to define the role of online platforms as digital town squares, publishers, editors, content distributors, or something sui generis. Initially disinclined to manage user-generated content given the broad protections offered by Section 230 of the Communications Decency Act, platforms have increasingly been subject to political, economic and ethical incentives to actively moderate users’ activities online. As the public debate about hate speech and fake news rages on, content moderation is at an inflection point. Public pressure has built for platforms to parse fact from fiction and ideology from opinion, and to perform content moderation at scale, in real time, across geographic, legal, language and cultural barriers spanning the globe.

Professionals focused on this work at companies ranging from social networks to the sharing economy, from online retailers to traditional media, are charged with devising, implementing and enforcing community values and instituting norms in digital ecosystems. Our inaugural workshop provides an opportunity for those working in trust and safety and content moderation to come together to discuss what’s next for the profession.

Subscribe to the Content Moderation Digest

Interested in staying abreast of the industry news related to content moderation and trust and safety? Subscribe to our Content Moderation Digest. The digest tracks the ethical, political, and economic issues surrounding the monitoring of users’ activities online and the challenges companies face in sifting out illegal and offensive content, inauthentic accounts and harmful material.

Other Conferences

29 – 30 October
Sydney

Delivering world-class discussion and education on the top privacy issues in Australia, New Zealand and around the globe. Summit is your opportunity to network with, gain insights from and exchange operational best practices with other ANZ privacy leaders.

November 7
New York City

Attend this one-day conference to receive practical, in-depth content from leading lawyers and consultants on CCPA compliance.

Training 18-19 November
Workshops 19 November
Main Conference 20-21 November
Brussels

Your source for European data protection policy debate, multi-level strategic thinking and thought-provoking discussion. The Congress delivers thought leadership, a thriving professional community and unrivalled education.