A recent stand-off between TikTok and ShareChat, two major online social media platforms in India, has brought to the fore several issues and inconsistencies in India’s intermediary liability regime.
On August 23, it was reported that ShareChat, the ostensibly ‘homegrown’ social media app, complied with certain takedown notices sent by TikTok over content which had previously been shared on TikTok. TikTok claimed it had ‘exclusive rights’ over such content and that it was being uploaded and shared on ShareChat without authorisation. On August 27, it was reported that ShareChat had written to MEITY, claiming that TikTok’s actions and its claims of ‘exclusivity’ over content on its platform were inconsistent with its status as an intermediary under the Information Technology Act (‘IT Act’) – a charge TikTok denied in a statement. According to a report by the Economic Times, ShareChat asked the Government to “clarify the intermediary liability status of platforms engaging in such practices.”
Subsequently, the issue appears to have been taken up by government officials as well as in Parliament, with questions being raised over TikTok’s takedown notices, as well as over its ownership and control by Chinese companies, its security practices and its content moderation efforts.
The dispute raises a number of legal issues which are important to consider at this stage, not just for TikTok and ShareChat, but for the state of platform regulation across India. It is also an opportunity to clarify important aspects of India’s intermediary liability regime.
There are a few distinct forms of intermediary regulation in India. The foremost and broadest of these is located within the IT Act, particularly under Section 2(w) read with Section 79. This is also the law relevant to our analysis.
As per these sections, an intermediary is defined by the functions it performs with respect to specific ‘electronic records’. Under Section 2(w), an ‘intermediary’, “with respect to any particular electronic records, means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record and includes …” (emphasis mine)
Section 79 provides exemptions, or ‘safe harbour’, from liability for ‘third party information’ under certain circumstances. These circumstances include where “the intermediary does not– (i) initiate the transmission, (ii) select the receiver of the transmission, and (iii) select or modify the information contained in the transmission” (emphasis mine).
‘Transmission’ is not defined under the IT Act.
To go back to the present dispute, the broad question to be answered is this – do TikTok’s claims of ownership or ‘exclusivity’ over certain content on its platform change its status as an intermediary qualified for the exemptions under Section 79? In my opinion, the statements made by certain lawyers and politicians that they do are incorrect and misread the applicable law.
The language of the IT Act, which I have emphasised above, explicitly limits the functional qualification of intermediaries to a particular electronic record or to a specific transmission. The implication is that an ‘intermediary’ under the law should not be understood as a broad, general category applicable across all of an entity’s functions, but as a legal category that applies when an entity deals with a particular ‘electronic record’ in a particular manner.
This would mean, therefore, that TikTok’s claims of copyright or control over content may exclude it from claiming safe harbour exemptions for the specific pieces of content over which it claims to exercise exclusive control (either as a copyright holder or on behalf of the copyright holder), on the grounds that such content no longer qualifies as ‘third-party information’, or that it is ‘selecting or modifying’ the transmission. If such content includes unlawful content, TikTok may be found liable under other applicable laws. However, on a textual reading of the law, this would not make TikTok generally liable for all content on its platform over which such control is not exercised.
This conceptual obscurity about the scope of Section 79 is not only seen in media statements but is also increasingly present in various judgements, including otherwise well-reasoned judgements of High Courts. Moreover, to say that TikTok may still claim intermediary safe harbour is not to downplay the accusations that the platform has failed to act responsibly over unlawful content it hosts – including child pornography and hate speech. The law as it stands – which requires judicial determinations over the unlawfulness of every specific piece of content – is clearly inadequate for regulating the manifestly unlawful speech that occurs at scale on platforms.
What this does point to is the need to reframe and broaden the discussion around intermediary liability and platform regulation. As I have argued elsewhere, framing the discussion around present ideas of ‘liability’ for intermediaries is insufficient for governing contemporary content sharing platforms. What we need to work towards is a form of platform regulation which ensures that platforms are more transparent and accountable to their users and to the state, including perhaps the creation of a sectoral regulator able to audit platforms’ adherence to their codes of conduct and due processes when dealing with illegal content. Until the law is amended, however, it would be difficult to claim that TikTok is generally liable for the third-party content hosted on its platform.