
An Overview of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021

Introduction 

Given the rising importance of the internet, both as a tool and an instrument of change, governments around the world have regulated it to varying degrees. On the one hand, there are the examples of China, Russia and countries in the Middle East, which have historically controlled their citizens’ use of the internet with an iron fist; on the other, countries in the EU, Australia and New Zealand, although considered relatively “freer” societies, have started creating regulations around the internet and holding the “big-tech” giants more accountable in their respective jurisdictions.

India, which until recently employed a rather soft-touch approach to regulating the internet, is also now gearing up for the modern age with some vigour, especially with a new data protection regime expected in the coming months (the PDP Bill). As part of its push to regulate the internet in the modern age, the Indian government notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (the Rules) on 25 February 2021. With the notification of the Rules, the Information Technology (Intermediaries Guidelines) Rules, 2011 (the Old Rules) stand repealed.

As was somewhat inevitable, the Rules have been met with criticism from various quarters and, as discussed in the following sections, have already become the subject of several lawsuits. This update looks at some of the most significant features of the Rules.

Structure of the Rules  

Whilst the Rules have been promulgated by the Ministry of Electronics and Information Technology (MeitY) under Section 87 of the Information Technology Act, 2000 (the IT Act), they seek to regulate not only intermediaries, including social media platforms, but also publishers of (i) news and current affairs content, including news aggregators; and (ii) online curated content. Accordingly, the Rules are divided into two parts: the first is reserved for the regulation of intermediaries by MeitY (the Intermediaries Section) and the second for the regulation of digital media and online curated content by the Ministry of Information and Broadcasting (the OTT Section, and the ministry, MIB).

Intermediaries Section  

1. Increased compliance: Guidelines for the regulation of intermediaries around the world contain “safe harbour” provisions that protect intermediaries, i.e., a list of actions and steps that, if undertaken by the intermediary in question, will give it immunity from legal action if something objectionable is found on its platform. Such measures include publishing its privacy policy and user agreements, and maintaining neutrality in respect of the content that is posted to its platform (i.e., not (i) initiating the transmission; (ii) selecting the receiver; or (iii) modifying the transmission in any way).

The safe harbour provisions in the Rules include a few additional measures, such as informing users of the platform of any changes to the terms of use and/or the consequences of non-compliance at least once a year. In addition, intermediaries are now required to remove or take down information that is “patently false”, “misleading”, capable of causing injury to a person and invasive of a person’s bodily privacy. This places a substantial burden on intermediaries, as they will now be required to determine when a piece of information is “patently false”, “misleading” or capable of causing injury.

Grievance redressal has also been made more stringent. Whilst the Old Rules also required intermediaries to appoint a grievance redressal officer to deal with complaints received from users, the Rules now impose a deadline of 24 hours within which such officers must acknowledge a user’s communication, and a time frame of 15 days within which to resolve the dispute.

Under the Rules, the period for which intermediaries must retain user data (such as registration information) after the cancellation or withdrawal of a user’s registration has been increased from 90 days to 180 days. This is because the government has the power to compel intermediaries to divulge information to aid an investigation, and such records will help the government identify specific users. This requirement contrasts with the general requirement under the PDP Bill not to retain data for longer than is strictly necessary.

2. Creating different classes of intermediaries: A new class of intermediaries that has emerged globally is the social media intermediary – platforms such as WhatsApp, Twitter, Facebook and YouTube. As the name suggests, this class includes intermediaries which “primarily or solely enable online interaction between two or more users and allows them to create, upload, share, disseminate, modify or access information” using their services. The Rules, much like the PDP Bill, differentiate between two classes of social media intermediaries. Social media intermediaries with fewer than five million registered Indian users are considered ordinary intermediaries, whereas those with five million or more registered Indian users are termed significant social media intermediaries. However, under the Rules, the government retains the flexibility to designate a social media intermediary with fewer than five million registered Indian users as a significant social media intermediary if it believes that such intermediary has the potential to cause a “material risk of harm” to the integrity and sovereignty of the nation. Thus, the government could potentially ensure that each new service started by Facebook, Twitter or Google is automatically classified as a significant social media intermediary even before it has obtained enough users, while an Indian competitor may be allowed to rack up the requisite number of registered users before falling within this category.

Significant social media intermediaries will need to comply with a host of additional compliance requirements by 24 May 2021. The most relevant of these is the requirement to appoint Indian-resident employees as chief compliance officer, grievance redressal officer and nodal contact person (it being noted that these roles cannot be combined in a single person). In addition, significant social media intermediaries are now required to have a physical contact address in India to which communications can be delivered. Beyond the increased compliance burden of hiring employees resident in India, these companies will also need to evaluate the Indian tax implications of having a “presence” in India by way of such local employees. Moreover, given that the local compliance officers of these entities may also be subject to civil and/or criminal liability, it may prove difficult in practice for such entities to identify individuals who are willing to take on this role.

Further, and in line with the PDP Bill, the Rules require significant social media intermediaries to provide their users with an option to voluntarily verify themselves (using their “Aadhaar” identification cards or their mobile numbers), such that their profiles will subsequently display a “demonstrable mark” of such verification. Whilst this measure is purely voluntary in nature, it will likely increase compliance costs for significant social media intermediaries, as they will need to provide the infrastructure to process these additional requests whilst remaining in compliance with the principles of data protection enshrined in the PDP Bill.

3. End to end-to-end encryption?: The short answer? Probably not. However, it is important to understand the nuances around this aspect before concluding that the Rules do not pose a threat to end-to-end encryption.  

The Rules require significant social media intermediaries which provide messaging services (such as WhatsApp, Signal, iMessage and Telegram) to ensure that they have the technology to identify the first originator of a particular message. Whilst the Rules do not require the intermediaries to divulge (or, for that matter, know) the contents of the message, the requirement to identify the first originator is enough to break the true end-to-end encryption offered by the likes of Signal, which do not even collect metadata related to their users.

Added to this is the obligation on such intermediaries to use technology and artificial intelligence-enabled tools to identify and remove unlawful content (such as content relating to offences against the sovereignty of India, rape or child sexual abuse material), with an added layer of human oversight. It is important to note that this obligation is in addition to the due diligence requirements that such social media intermediaries need to comply with in order to qualify for the safe harbour provisions discussed above.

It may well be that the government had only microblogging sites like Twitter or Koo in mind when including this obligation, as platforms such as WhatsApp and Signal would necessarily be required to break their end-to-end encryption in order to view and moderate the messages passed on their platforms. However, given that the Rules do not make this distinction, they could theoretically require such platforms to break end-to-end encryption, which such intermediaries may not be willing to do, and which could also impinge on users’ rights to freedom of speech and expression.

OTT Section  

The Rules do something that no similar guidelines around the world have done – they include within their ambit publishers of digital news media (such as Editorji and Mojo Story) as well as online curated content libraries (such as Netflix and Amazon Prime). This inclusion is, of course, one of the most striking features of the Rules, especially given the Old Rules did not seek to govern such entities. The following are some of the other important features that such entities should be aware of.  

1. No classification: Unlike the classification carried out for social media intermediaries on the basis of the number of registered Indian users, digital news media and online curated content (together, OTT) are not classified, whether on the basis of user numbers or otherwise. Accordingly, any OTT provider, whether it is a start-up (which most new-age journalists are part of) or an established organisation, and whether based in India or abroad, falls within the ambit of the Rules. All that is necessary is that such OTT provider makes its content available in India in a systematic and continued manner. Theoretically, therefore, CNN’s website, NDTV’s Instagram feed and a journalist using Twitter to disseminate her work will all be subject to the same set of regulations and scrutiny. It is important to bear this lack of classification in mind, especially because the increased costs of compliance will now have to be borne equally by all classes of OTT providers, including independent journalists.

Further, there is a lack of clarity on how foreign OTT entities will practically be regulated, given the Rules do not require them to have a physical presence in the country, though, as discussed below, they do require the appointment of local grievance redressal officers.

2. Grievance redressal: All OTT providers are now required to follow a three-tier grievance redressal mechanism. The first tier involves the establishment of an internal grievance redressal mechanism with the appointment of a grievance redressal officer, resident in India, much as significant social media intermediaries are now required to do. The second tier involves the formation of a self-regulating body, which will sit in appeal over any complaint that does not receive a satisfactory resolution in the first tier and may censure or admonish the publisher, require an apology, or even censor content provided by the publisher as it deems fit. Whilst this requirement does not seem particularly onerous on the face of it, given that the body will be a “self-regulating” body, it should be noted that this body needs to be registered with the MIB and the MIB needs to be satisfied with its composition. Accordingly, unless this body is composed of people who are acceptable to the MIB, it may not receive a green light from the MIB.

The third tier is the oversight mechanism which the government envisages introducing as part of the Rules. This oversight will be carried out by an “inter-departmental committee” comprising members of different departments of the government, which will have the power to delete or modify content on all OTT providers for the prevention of incitement of violence. Again, whilst this may appear to be a reasonable restriction on the freedom of OTT providers to show the content they would like to, recent examples, such as the reaction to certain scenes in a show on one OTT platform and the resultant self-censorship, indicate that such powers may significantly limit what is shown on such platforms.

3. Potential for censorship?: The provision that has the most potential for over-reach is the power provided to the government under the Rules to block content on all OTT providers without providing an opportunity for a hearing, in cases where the government states that no delay is acceptable, especially when, in its assessment, the content has the potential to incite violence or disturb friendly relations with other nations. Accordingly, the recent coverage by media outlets of the Indo-Chinese border conflicts could now theoretically be blocked by the government under the Rules. Whilst this does not mean that the government will actually censor such content, the fact that such censorship is possible may lead to a culture of self-imposed censorship.

Conclusion  

As discussed above, the Rules are likely to have far-reaching consequences for how content is created and hosted online, and how social media intermediaries regulate the content that is hosted or stored on their platforms. Amid the ongoing second surge of the Covid-19 pandemic in India, people took to social media platforms, especially Twitter, to criticise the government’s handling of the crisis, as well as to help each other find hospital beds, oxygen and other supplies. The government’s response was to ask Twitter to delete several of these tweets, pursuant to its powers under the IT Act and the Rules, on the basis that they propagated “fake news”. This action has come under criticism from a number of quarters, including the Supreme Court of India.

In addition, media organisations such as The Wire and News Minute have challenged the Rules in various High Courts, and the Kerala High Court recently stayed any potential adverse action against media organisations until it determines whether the Rules are legal and valid. The Parliamentary Standing Committee on Information Technology has also called into question the validity of the Rules and is conducting hearings with members of MeitY and MIB. It will be interesting to see how the Rules fare, both in front of the courts and on the floor of Parliament.


This material is for general information only and is not intended to provide legal advice. For further information, please contact:

Anuj Bhatia
Partner
anuj.bhatia@touchstonepartners.com  

 Punya Varma   
