- Draft Information Technology [Intermediaries Guidelines (Amendment) Rules] 2018
The Information Technology [Intermediaries Guidelines (Amendment) Rules] 2018 have been formulated by the Ministry of Electronics and Information Technology to prevent the spread of fake news and pornography, curb the misuse of social-media platforms, and provide security to users.
KEY FEATURES OF THE 2018 DRAFT RULES
1) The 2018 Rules provide that whenever a government agency issues an order seeking information or assistance concerning cyber security, the intermediary must provide it within 72 hours. Earlier, such requests could be made only in writing; the government has now provided that they may also be made by electronic means.
2) The 2018 Rules reduce the time frame from 36 hours to 24 hours: intermediaries are required to “disable access” within 24 hours to content deemed defamatory, against national security, or otherwise falling within the restrictions under Article 19(2) of the Constitution. Further, companies having more than fifty lakh users must have an office in India duly registered under the Companies Act.
3) The 2018 Rules also require the appointment of a nodal officer who can work with law enforcement agencies round the clock. Under the 2011 Rules, intermediaries were required to store such information and records for a period of ninety days; the 2018 Rules extend this period to 180 days.
4) It may also be noted that the government has taken steps to protect the freedom of speech and expression guaranteed by the Constitution of India, as no regulations have been drafted with respect to the content appearing on social media.
The European Union’s (EU) General Data Protection Regulation (GDPR) and Its Impact on the Indian Intermediary Rules
Nine Principles for Future EU Policymaking on Intermediary Liability
European policymakers and governments have concerns about the impact of various kinds of online content and user behaviour. Policymakers worry about content that may be illegal, such as some forms of hate speech, and content posted with the intent to incite violence for ideological and/or religious reasons.
- Policy and legislation must respect human rights principles on freedom of expression.
Legislators are obliged to abide by the principles laid down in human rights instruments, notably Article 19 of the International Covenant on Civil and Political Rights and Article 10 of the European Convention on Human Rights. Independent courts must remain the arbiters of what is and is not permissible speech, under clearly articulated laws. De facto legal standards should not be set by company reviewers or automated content moderation tools, nor delegated to administrative authorities. Moreover, if policymakers consider certain content unacceptable and harmful even though it is not illegal, it is their job to legislate for it (respecting the human rights and rule-of-law principles referred to above).
- Policy should be based on the principle that content creators are responsible, under the law, for their online speech and behaviour.
Content creators are responsible, under the law, for their online speech and behaviour, and accountable for the content they post and how they otherwise behave. People should be aware that if they post content that constitutes, for example, illegal hate speech or defamation, they can be prosecuted for it.
- Policy should be based on solid evidence, and targeted at well-defined and well-substantiated public interest concerns.
The Audiovisual Media Services Directive calls for the setting up of codes of conduct to ensure that minors are not exposed to content that may be considered harmful to them. The current draft Terrorist Content Online Regulation would impose duties of care, as well as a requirement on content hosts to suppress content deemed illegal under the regulation within one hour of notification. Any Digital Services Act should be carefully calibrated to focus on clearly defined problems that are not addressed in other legislation.
- Policy should ensure clarity about requirements for responding to notifications of illegal speech.
Rather than penalising a content host merely for hosting illegal content, it is essential that platforms’ efforts to restrict illegal content do not lead to a presumption of knowledge of illegality. Any new legislation should introduce a version of the Good Samaritan principle, to ensure that intermediaries are not penalised for good-faith measures against illegal content. Sanctions should be applied only in cases of proven systemic failure to respond to valid notifications of illegal content.
- Content hosts should not be discouraged from, or limited in, their capacity to moderate content.
As a principle, it is both legitimate and desirable that platforms restrict lawful content they do not consider, for whatever reason, appropriate for the service they provide. It is important to note, however, that the legal status of such content moderation is currently not clear. European courts have in certain cases ruled that a content host may not restrict lawful content, while in other cases ordering hosts to restrict content which the host had not considered to be in violation of either the law or its own terms of service. Policy should seek to incentivise human rights-respecting content moderation, as recommended by the UN Special Rapporteur on Free Expression.
- Use of technological solutions for online content moderation should not be mandated by law.
Content moderation technologies are increasingly used by a broad range of internet companies. They are not able to understand context and meaning, which is necessary to determine whether a statement posted on social media violates the law or a service’s terms. Policymakers must understand these limitations: they should not mandate, endorse, or adopt automated content analysis tools, nor impose time limits on responding to notifications of illegal content that would in practice necessitate the use of automated filters to comply with the law.
- Responsibility for content should not be imposed on companies other than the content host.
Infrastructure service providers, payment providers, advertisers, cybersecurity providers, and others should not be held responsible for content their customers host. Such companies lack the information needed to decide whether speakers have violated content policies, and they risk over-censoring in order to avoid liability.
- Policy should promote, not hinder, innovation and entrepreneurship.
One of the most important and successful features of the limited liability provisions in the E-Commerce Directive (ECD) is their capacity to encourage innovation and entrepreneurship. Had it not been for these protections, the many thousands of online sites and services that have appeared in Europe and beyond would not have grown and prospered. If new legislation undermines these protections and introduces across the board new responsibilities and obligations calibrated for global internet companies, it will have a disproportionate negative impact on small companies and start-ups, and could further shrink the diversity of platforms and hosts available to support a broad range of expression online.
- Content hosts should not be forced to apply one Member State’s restrictions on free expression outside that country’s territory.
Cross-border enforcement of restrictions would lead to unacceptable infringement of free expression and access to information rights. EU Member State laws vary considerably in how they restrict free expression. For example, some countries criminalise content such as blasphemy, while others have abrogated blasphemy laws; many countries prohibit hate speech but apply those prohibitions differently based on the cultural and historical context of their particular state. If hosts are required to apply one country’s speech restrictions broadly, they will inevitably encounter conflicts of law and the space for free expression and public debate would be severely curtailed.
LIABILITY OF ONLINE INTERMEDIARIES IN EU MEMBER STATES
AUSTRIA – Austria regulated the liability of online intermediaries by approving a federal bill to enact the Federal Telecommunications Statute in 1997, which provided that owners of “broadcasting installations and terminals” (such as computer servers) were liable unless they had taken appropriate and reasonable steps to prevent wrongful use of their equipment.
FRANCE – Following the protests that arose after the confiscation of the computer equipment of two internet access providers, Francenet and Worldnet, the Minister of Telecommunications introduced a bill in 1996 to limit the liability of online intermediaries. The bill exempted online service providers from criminal liability for third-party infringements, provided that they did not participate in those infringements, that they offered filters to prevent access to certain services, and that their services were not disapproved by the Committee of Telematics. The proposed amendment was, however, annulled by the Constitutional Council due to formal errors.
THE NETHERLANDS – The liability of online intermediaries was first addressed in the Netherlands in Bridgesoft v. Lenior, in which a bulletin board operator was charged with direct copyright infringement because it allowed its subscribers to upload and download pirated software. The court found the operator liable for copyright infringement, and also found that the operator had acted negligently, since it should have been aware of the possibility of copyright infringements.
LIABILITY OF INTERMEDIARIES IN INDIA
“Intermediary” is defined in Section 2(1)(w) of the Information Technology Act, 2000. “Intermediary”, with respect to any particular electronic message, means any person who, on behalf of another person, receives, stores or transmits that message, or provides any service with respect to that message. The liability of intermediaries is lucidly explained in Section 79 of the Act.
SECTION 79 OF THE INFORMATION TECHNOLOGY ACT, 2000
Section 79 of the Information Technology Act, 2000 exempts intermediaries from liability in certain instances. It states that intermediaries will not be liable for any third-party information, data or communication link made available by them. This exemption extends only to those instances where the intermediary merely acts as a facilitator and does not play any part in the creation or modification of the data or information. The provision also makes the safe-harbour protection contingent on the intermediary removing any unlawful content on its computer resource on being notified by the appropriate Government or its agency, or upon receiving actual knowledge.
The provision in its present form was introduced by the Information Technology (Amendment) Act, 2008, on the demand of the software industry and industry bodies for protection from liability that could arise from user-generated content.
The draft 2018 Rules have been tabled before Parliament in India. It remains to be seen how far the EU GDPR standards will be complied with by the Indian legislature.
Advocate Shruti Bist, Supreme Court of India
Chamber 63, Lawyers’ Chambers, Supreme Court
Email: [email protected]