Dear privacy pros,

On 3 Oct., the Online Safety (Miscellaneous Amendments) Bill was introduced and given its first reading in the Singapore Parliament. The bill would give the Infocomm Media Development Authority greater powers to regulate content on social media platforms, including the power to issue directions blocking access to "egregious content" or requiring that such content be taken down.

Egregious content includes posts advocating suicide or self-harm, content depicting child sexual exploitation or sexual violence, materials relating to terrorism and materials that may incite racial or religious tensions or pose a risk to public health in Singapore.

The proposed bill would introduce, under the Broadcasting Act, a Code of Practice for Online Safety requiring providers of online communication services with "significant reach or impact" to establish systems and processes that prevent people in Singapore, particularly children under the age of 18, from accessing harmful content. Such measures may include tools that let children or their parents manage their safety while using these services, or content moderation carried out in accordance with established community standards or guidelines.

This is a timely move, given similar shifts toward imposing greater oversight of, and potential liability on, social media platforms for the content they host. In the U.K., a coroner's inquest recently concluded that the death of 14-year-old Molly Rose Russell resulted from "an act of self-harm while suffering from depression and the negative effects of online content," in the words of senior coroner Andrew Walker. The U.K.'s own Online Safety Bill, now making its way through Parliament, was partly galvanized by Russell's death.

According to the inquest, Russell viewed thousands of images on Instagram and Pinterest relating to suicide, self-harm and depression in the six months before she took her own life. Walker concluded the online content Russell viewed "affected her mental health in a negative way and contributed to her death in a more than minimal way."

There is a difference between intermediaries that are "mere conduits" for user-generated content and service providers that play a more active role in organizing content, as most social media platforms do: their algorithms tailor content to individual viewers based on each viewer's level of engagement.

However, it is also clear that social media companies cannot effectively police every piece of content on their platforms. Some cases will call for judgment beyond what a content moderation team can reasonably be expected to exercise. Singapore's Online Safety (Miscellaneous Amendments) Bill will hopefully strike the right balance between holding social media companies accountable for their content moderation regimes and allowing some flexibility, through mechanisms such as notice-and-take-down, to deal with material that may have been missed despite best efforts.