You Facilitate We Penalize: How Intermediary Liability Works

By Prakhar Raghuvanshi

The Internet is a unique medium of expression and deserves a different standard of protection than other media. Any organization that facilitates transactions between third parties over the internet is an internet intermediary.1 It primarily provides access to the host and transmits content originating from third parties. It does not create or own the content that is published or broadcast. The question that arises is whether such a company can be held liable for a publication; one school of thought contends that doing so would be akin to holding a telephone company liable for threats made over the phone by one person to another. If yes, can a State compel an intermediary to remove such posts?

Liability of an Intermediary

The backbone of intermediary liability in India is Section 79 of the Information Technology Act, 2000 [“IT Act”]. An intermediary is not liable if its function is limited to providing a communication system; it does not initiate the transmission, select the receiver, or select or modify any information contained in the transmission; and it observes due diligence. On the other hand, it is liable if it conspires, abets, aids, or induces the commission of an unlawful act, or if it receives actual knowledge of unlawful content from the appropriate government and fails to remove or disable access to that content.

The draft rules under the IT Act propose that an intermediary shall remove or disable access to unlawful content falling within the ambit of Article 19(2) of the Constitution of India when it receives such knowledge from the government or through a court order, and that it should provide assistance to the government agency within 72 hours. Further, the same set of rules grants an intermediary immunity provided that it does not knowingly host, select or modify any information contained in a transmission.

However, one of the Manila Principles on Intermediary Liability is that intermediaries should be shielded by law from liability for third-party content. In Myspace Inc. v. Super Cassettes Industries2, the court held that intermediaries serve only as a conduit for the exchange of information between users and are neither obligated nor equipped to verify all the information stored on their websites. Intermediaries simply provide the infrastructure to share content and have nothing to do with the content itself.

In the United States, Section 230(c)(1) of the Communications Decency Act, 1996 provides that no provider of an interactive computer service may be treated as the publisher or speaker of any information provided by another information content provider. It was held in Zeran v. AOL3 that this provision was enacted to maintain the robust nature of online communication and to keep government interference in the medium to a minimum.

The ECD and DMCA safe harbours, which apply to providers of hosting services, protect them from liability for content stored on their platforms, provided they do not have actual knowledge of the illegal content. An intermediary cannot be held liable, since it can claim the benefit of the safe harbour provision if it does not have actual knowledge that the material on its system is infringing.4

Enforcement of ‘Take-Down’ Orders

Section 69A of the IT Act empowers the Central Government to issue directions to intermediaries to block public access to any information through any computer resource on the grounds mentioned in Article 19(2) or to prevent incitement to the commission of any cognizable offence. Non-compliance with such directions attracts imprisonment of up to seven years and a fine. The problem in such cases lies in enforcement when the intermediary is situated outside the territory of the country. ‘Take-down’ orders require extra-territorial application of laws, as they involve a removal order from the government of one state directed at an intermediary based in another state.

Laws that order intermediaries to take down or block certain content have the potential to impair online expression or access to information, and intermediary action requirements should therefore be narrowly drawn. A website or an information dissemination system cannot be prohibited from publishing material solely on the basis that it may be critical of the government. States are compelling such platforms to monitor user-generated content and subjecting them to extra-territorial content takedown rules, which restricts freedom of speech and expression. This defeats the very purpose of such intermediaries, which is to serve as a vehicle for the universal right to freedom of expression. They act as conduits and are therefore unable to assess the context surrounding the unlawful content they are compelled to remove.

The US Supreme Court in Calder v. Jones5 observed that the objective territoriality or ‘effects’ test is applied to see whether the effects of the web platform have resulted in some harm or injury in the complaining State. The test requires proof that the intermediary specifically directed its content towards the State with the intention of causing injurious effects in its territory. In addition, to compel an intermediary and subject it to the jurisdiction of the State, it must be shown that the intermediary purposefully availed itself of the jurisdiction of the State by specifically targeting the public in that State in contravention of an existing statute (which in India’s case is the IT Act).

Concluding Remarks

Intermediary liability is treated very differently in India as compared to other jurisdictions. The threshold for imposing liability is higher in other jurisdictions, which proves to be the better approach. The Indian statute is strict on liability and on the takedown orders that follow from it. There are, however, two major issues with it.

Firstly, subjecting an intermediary to outside jurisdiction is troublesome, and it is even more difficult if the said intermediary does not officially operate in that State. Content cannot be subjected to takedown orders solely because it might not appear sound to the government of a country. For instance, if there is a post expressing an unpopular opinion about Hinduism which the government might find objectionable, subjecting it to a take-down order would, in my opinion, be wrong. There is no efficient and proper mechanism to determine what content can be subjected to takedown orders. Municipal laws lay down the grounds on which such content can be taken down, but whether they meet international standards is the relevant question.

Secondly, even assuming that content is validly subjected to a takedown order and a proper and efficient mechanism exists to determine this, intermediaries lack the technical ability to take it down only from a particular jurisdiction. A lack of technical ability on the part of intermediaries cannot and should not be a ground to restrict freedom of speech and expression. The draft rules provide for proactive filtering; such a process has to be balanced against technical infeasibility, i.e. whether the intermediary possesses the means to remove the content. This process may be especially difficult for small players in the market.

In addition, imposing proactive filtering goes against the rulings of the Supreme Court, which sees intermediaries merely as neutral facilitators.6 If proactive filtering is carried out, intermediaries will cease to be mere neutral facilitators and will instead become adjudicators of content. Lastly, allowing intermediaries to monitor content also violates the right to privacy and is therefore contrary to the SC ruling in Justice K.S. Puttaswamy v. Union of India7. What I have argued throughout is not that such intermediaries must be exempt from liability, but that the threshold for liability should be just and proportionate, keeping in mind the rights involved and the technological limitations. The content that can be subjected to takedown orders must be clearly defined, and any vagueness or ambiguity must be expunged. The provisions must be brought in line with the Supreme Court’s rulings, and jurisprudence from other jurisdictions should be considered as well.

The author is pursuing their 2nd year of law, at the National Law University, Jodhpur.

  1. See also The Information Technology Act, 2000, No. 21, Acts of Parliament, 2000, S. 2(w).
  2. Myspace Inc. v. Super Cassettes Industries, 236 (2017) DLT 478 (Del.).
  3. Zeran v. AOL, 129 F.3d 327 (4th Cir. 1997).
  4. Barnhart v. Sigmon Coal Co., 534 U.S. 438, 440 (2002); Google India Private Limited v. M/S Visaka Industries Limited, (2017) ALT 1 (AP) 620.
  5. Calder v. Jones, 465 U.S. 783, 790 (1984).
  6. Shreya Singhal v. Union of India, (2015) 5 SCC 1.
  7. Justice K.S. Puttaswamy v. Union of India, (2017) 10 SCC 1.