The Indian Ministry of Electronics and Information Technology (MeitY) released the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, on Feb. 25, 2021. The rules replace the intermediary guidelines rules from 2011 and are promulgated under the Information Technology Act, 2000.
The rules place new obligations on companies that operate in India, requiring them to appoint personnel based in India, add specific terms to company policies, adopt proactive content removal tools and implement new notification procedures.
The majority of the new obligations entered into effect Feb. 25, the day the rules were published in the Gazette. However, the additional due diligence requirements for significant social media intermediaries will come into effect May 25, 2021.
The rules have several parts, including sections laying out due diligence requirements for intermediaries and a code of ethics for digital media. This piece focuses on the new due diligence requirements for all types of intermediaries, or Parts I and II of the rules.
Purpose of the rules
In a Feb. 25 news release, the MeitY stated the rules are intended to combat harmful content, including the "persistent spread of fake news" and the "rampant abuse of social media to share" revenge porn and "morphed images of women" online. The news release also points to concerns regarding an increase in harmful language and "blatant disrespect to religious sentiments" shared on platforms. MeitY states law enforcement has seen an increase in the use of social media for criminal pursuits. The rules are designed to "empower ordinary users" and hold platforms accountable.
Who must follow the rules
The rules apply to and define three types of intermediaries: intermediaries, social media intermediaries and significant social media intermediaries.
Under the IT Act, an intermediary, "with respect to any particular electronic records, means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record." The definition encompasses "telecom service providers, network service providers, internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online-market places and cyber cafes."
The term social media intermediary means "an intermediary which primarily or solely enables online interaction between two or more users and allows them to create, upload, share, disseminate, modify or access information using its services."
The rules also distinguish between smaller social media intermediaries and significant social media intermediaries based on the number of users. Significant social media intermediaries are social media intermediaries that have more than 5 million users in India.
The inclusion of social media intermediaries means the rules will cover global and U.S.-based social media companies that have widespread adoption in India, such as WhatsApp (530 million Indian users), YouTube (nearly 450 million Indian users) and Facebook (410 million Indian users).
Breakdown of selected due diligence for intermediaries
Entities that fall into one of the above categories must conduct certain due diligence activities.
What an intermediary must do
An intermediary must publish its "rules and regulations, privacy policy and user agreement" on its website or mobile application. At least once a year, the intermediary must inform users that, if they do not comply with the published rules and regulations, privacy policy or user agreement, the intermediary has the "right to terminate" the user's access or usage and "remove noncompliant information." Additionally, the intermediary must notify users of any changes to its rules and regulations, privacy policy or user agreement at least once a year.
When an intermediary collects registration information about a user, it must retain that registration information for 180 days even after the user has withdrawn or canceled the registration.
Furthermore, intermediaries have 72 hours to respond to a written government order requesting one of four types of information: (1) information the intermediary controls or possesses; (2) information to assist the government in cybersecurity or protective investigations; (3) information about identity verification; or (4) information for the "prevention, detection, investigation, or prosecution, of offences under any law… or for cyber security incidents." If an intermediary discovers a cybersecurity incident, it must report it to the Indian Computer Emergency Response Team (CERT-In), a functional organization of MeitY.
What an intermediary must mandate to users
The rules specifically identify 10 things all intermediaries must include in their rules and regulations, privacy policy or user agreement. Essentially, users will not be allowed to "host, display, upload, modify, publish, transmit, store, update or share any information that:
- Belongs to another person and to which the user does not have any right; is defamatory, obscene, pornographic, pedophilic, invasive of another's privacy, including bodily privacy, insulting or harassing on the basis of gender, libelous, racially or ethnically objectionable, relating or encouraging money laundering or gambling, or otherwise inconsistent with or contrary to the laws in force;
- Is harmful to child.
- Infringes any patent, trademark, copyright or other proprietary rights.
- Violates any law for the time being in force.
- Deceives or misleads the addressee about the origin of the message or knowingly and intentionally communicates any information which is patently false or misleading in nature but may reasonably be perceived as a fact.
- Impersonates another person.
- Threatens the unity, integrity, defense, security or sovereignty of India, friendly relations with foreign States, or public order, or causes incitement to the commission of any cognizable offence or prevents investigation of any offence or is insulting other nation.
- Contains software virus or any other computer code, file or program designed to interrupt, destroy or limit the functionality of any computer resource.
- Is patently false and untrue, and is written or published in any form, with the intent to mislead or harass a person, entity or agency for financial gain or to cause any injury to any person."
If the intermediary receives a court order or is notified by the government or a government agency that "any such information is hosted, stored or published," the intermediary will have 36 hours to take down or disable the information.
When a significant social media intermediary removes information that is determined to fall under one of the categories in the above list, the significant social media intermediary must notify the user who uploaded, shared or modified the content as to why the information was removed. Additionally, the significant social media intermediary will need to provide the user an "adequate and reasonable opportunity to dispute the action … and request … the reinstatement of access."
Grievance mechanism for all intermediaries
The rules describe a "grievance redressal mechanism" that all intermediaries must follow to allow individuals to bring a complaint alleging that the mandated rules have been breached. All intermediaries must appoint a specified grievance officer. The grievance officer's name and contact information, as well as instructions on how to submit a complaint, must be on an intermediary's website or app. Upon receipt of a complaint, the grievance officer will have 24 hours to acknowledge the complaint and 15 days to resolve it.
Additional due diligence requirements for significant social media intermediaries
The rules contain additional due diligence requirements that significant social media intermediaries must follow. These include requirements to create oversight roles, issue compliance reports, comply with law enforcement requests, and proactively monitor and remove certain content from their platforms.
Social media intermediaries not deemed "significant" must also adhere to these additional requirements if their services "permit[] the publication or transmission of information in a manner that may create a material risk of harm to the sovereignty and integrity of India, security of the State, friendly relations with foreign States or public order." In such cases, the intermediary will be notified in writing by MeitY that they must also follow the rules specified for significant social media intermediaries.
Appointments to three new roles
Significant social media intermediaries will need to appoint individuals who are Indian residents for three new roles. First, they must appoint a chief compliance officer to ensure compliance with both the IT Act and rules. Second, they must appoint a nodal contact person for "24x7 coordination with law enforcement agencies." Finally, as mentioned above, they must appoint a resident grievance officer to manage the grievance redressal mechanism.
Proactive removal tools
Significant social media intermediaries must use "technology-based" tools to "proactively identify" instances of "any act or simulation … depicting rape, child sexual abuse or conduct" or any reposting of previously removed, unlawful information under Part II, Section 3(1)(d), such as information that is against "public order; decency or morality" or relates "to the interest of the sovereignty and integrity of India." The significant social media intermediary will need to notify any user that tries to "access such information stating that such information has been identified by the intermediary under the categories referred to in [Part II, Section 3(1)(d)]."
Compliance reports
Furthermore, significant social media intermediaries must publish a monthly compliance report that details "complaints received and action taken on the complaints as well as details of contents removed proactively by the significant social media intermediary."
Government access to data and civil liberties concerns
Additionally, significant social media intermediaries that provide "services primarily in the nature of messaging" will need to know and provide the identity of the "first originator" (i.e., the person who first sent a message) upon judicial order. Such judicial orders must relate to "prevention, detection, investigation, prosecution or punishment of an offence" that is connected to the security of India, international relations, or sexual assault or child pornography. If the first originator is located outside of India, the first originator in India will be considered the first originator of the information. While the identity of the first originator must be disclosed, the intermediary is not required to disclose the contents of the message.
Although online commentators have suggested the new provision could be technically difficult to implement and "raises significant concerns over end-to-end encryption and encryption backdoors," some popular messaging apps subject to the rules, such as WhatsApp and Telegram, already have policies in place that suggest they may be able to comply with the first originator provision. In 2018, the Indian government requested WhatsApp provide the originator of a message, but WhatsApp refused, citing privacy concerns. Under the new rules, WhatsApp would now need to comply with a similar request or risk becoming liable for the activities of its users under Section 79(1) of the IT Act.
Noncompliance
Noncompliance by intermediaries results in losing the protection of Section 79(1) of the IT Act. This section states "an intermediary shall not be liable for any third-party information, data, or communication link made available or hosted by him." Essentially, this means that intermediaries that do not comply will face liability for what a third party does on their platform or app.
Conclusion
The new rules will dramatically change the legal landscape and liabilities of intermediaries in India. Significant social media intermediaries have less than a month remaining to implement what could be technologically difficult changes to their services to comply with the rules. The IAPP will report on any significant or noteworthy updates as the rules enter into force in full.