Companies can collect and process more data than ever before, and with that comes the opportunity for research and testing at unprecedented, potentially life-saving levels. Such research ranges from relatively harmless A/B testing, such as discovering how the color of a company’s logo affects user engagement, to murkier efforts aimed at the social good, like finding a cure for cancer.
There are growing calls from the research community to access the giant and potentially priceless troves of consumer data collected daily by tech giants like Facebook, Twitter, Google, and LinkedIn, in a trend called data philanthropy.
But what are the ethical and privacy implications of conducting research on user data, and how should companies go about making such decisions?
This was the topic explored Thursday by Facebook CPO Rob Sherman, LinkedIn Head of Global Privacy Kalinda Raina, and IAPP VP of Research and Education Omer Tene here at the Privacy. Security. Risk. conference in San Jose, California.
Bringing the topic into focus, Tene cited last year’s fallout from Facebook’s so-called “emotion-contagion” study, in which Facebook set out to determine whether positive and negative posts could affect other users’ moods. The resulting media uproar called into question the ethical standards tech companies apply to user data.
“The incident forced us to mature our thinking, to mature our framework when conducting research,” said Facebook's Sherman. He explained that Facebook has always had an internal decision-making process for this and that, now, all research is subject to an expert review process analyzing the research, the reasons for conducting it, and whether it is done ethically.
Raina said that at Apple, her previous employer, there was a great deal of thinking about research when creating new products and about the company's use of consumer data. Apple realized consumer data could be viable for medical research, providing better insight into diseases like cancer.
“At Apple, in the medical space, we looked closely at whether it would hold up to research standards and whether it works with the company’s standards,” she said.
“Researchers are clamoring for more data philanthropy and companies are expected to share the wealth,” Tene pointed out. So what should privacy pros do?
Raina said there are “real benefits companies can strive to achieve with the data they have,” but, for one, companies need to look at their own values. She stressed that companies need to be transparent with how they use data for research. “We would not invite outside researchers,” she noted. Rather, Raina said companies should provide researchers with tools – like applications – to reach out individually to users to get their permission. “There are ways companies can facilitate these opportunities,” she added.
Sherman, in part, agreed, but said Facebook thinks a lot about providing solutions for what he called "the social good." He said, for Facebook, the main value is to connect people. “We’re increasingly thinking about ways to develop our technology,” he said. “We can do things with technology to provide value to people around the globe.”
For example, he pointed to Safety Check. This feature uses crowd-sourced, aggregated data to locate spikes in trends that indicate a potential natural or human-made disaster. Users in an affected area can let friends and family know they're okay in cases of, say, earthquakes or terrorist attacks.
In another example, Sherman pointed toward aggregated population density mapping. This data set can help aid workers bring food and medical supplies to vulnerable communities. He said it's a solution that produces social value without compromising the privacy of individual people.
True, expectations around privacy have changed over time. When Facebook introduced its News Feed feature 10 years ago, it created a serious privacy controversy. Today, news feeds are a basic tenet of social media platforms; over time, users became comfortable with what was once perceived as a creepy feature.
But, in the bigger picture, should societal benefits outweigh the need to obtain consent for potential use of, albeit aggregated, consumer data?
For Raina, the answer is essentially no. "Who is in the position to make that call?" she said. "I don't think any of our companies are in the position for that." To her, the solution resides in providing tools for researchers.
Ultimately, are companies really in the position to make such ethically based calls when traditional notions of the corporate framework rely upon shareholder value? And are ethical review processes even appropriate for companies, Tene asked.
"Companies should consider ethics and evaluate what's being done with data," Raina said, adding companies need to be clear with their users. "As privacy pros, we must shape companies' value systems."