Facebook’s announcement yesterday of newly adopted guidelines and a review board for data research projects is the latest evidence of a new privacy truth: Legal compliance and sound security practices are not sufficient to meet societal expectations. With companies becoming laboratories for big data research, data ethics have become a critical component of corporate governance frameworks. Companies cannot view privacy as a compliance matter to be addressed by legal departments or a technical issue handled by IT. Rather, to avert public embarrassment and consumer backlash, they must consider ethical review processes and instill issue-spotting skills in employees throughout the organization.

Facebook’s move was driven by the criticism leveled against its experiment on user emotions, conducted by tweaking their News Feeds. Facebook did not break any laws. In fact, it has shown that it complied with its data use policy, and data security was not breached. And yet users, privacy advocates and the media were up in arms. As a market innovator – and a company whose very business is individual data – Facebook blazes a trail that companies in every industry will soon follow: creating internal standards for ethical data research.

Consider a story recently published in Vox about a researcher with a PhD in biology who underwent genetic testing using the 23andMe kit. One of the features of the service, enabled through customers’ opt-in consent, is a social media-type function that connects people based on the results of their genomic tests, including people who are directly related to them. Despite his expertise in biology, the researcher did not expect that one of the test’s discoveries would be a half-brother, sired by his father in secret. The revelation led to family turmoil, culminating in his parents’ divorce. Internal review of the social (and familial) implications of the data matching service might have averted this tangible privacy harm. The result of such review need not have been disabling the feature; perhaps users could simply have been viscerally alerted to its potential ramifications.

The line between wholesome and untoward practices is fickle. Even as Facebook was being castigated for its research, OkCupid’s co-founder Christian Rudder proudly published a best-selling book describing his own big data research. Indeed, Facebook itself has experimented with user data in the past, boosting the proportion of organ donations among users by adjusting their profile settings to allow them to publicly display their status as organ donors. It is difficult to say why one experiment is received more favorably than another. Perhaps users of online dating sites are more experimental in nature or more cavalier about privacy; or maybe the information they post is not as accurate as that posted on Facebook.

The lesson is clear: To navigate the treacherous waters of big data practices, companies must develop internal competency for ethical review. To do so, they must train personnel to spot privacy issues and flag concerns. Managing privacy and ethical data practices within an organization entails becoming steeped in a growing interdisciplinary body of knowledge, and maintaining a firm grasp of new developments in technology, business and law. Sound data management practices are not common knowledge. They require extensive training, continuous education and a verifiable method of certifying skills.

Indeed, just as qualified civil engineers build bridges and certified dentists perform root canals, so too should ethical data management be entrusted to duly qualified, adequately trained professionals who have the skillset needed to make decisions that impact the lives of individual consumers.