A Legal Scheme to Secure Elections against Social Media

Abhik Roy, BA-LLB, Utkal University, Odisha

Introduction

Elections are the linchpin of democracy. Through them, voters elect representatives to govern the country on their behalf. India, with more than 94 crore eligible voters, is the world's largest democracy, and elections are naturally integral to it. Lately, social media has changed the face of Indian elections by virtually erasing the distance between political parties and voters. During elections, social media eases public discourse and the dissemination of information, helping the public make informed decisions.

However, social media is easy to exploit and has inevitably become a hotbed of dishonest activities targeting the purity of elections. The scale of the threat is better understood from the fact that India has around 46.7 crore active social media users. Dedicated regulations are therefore needed to protect elections from the harms of social media. These regulations must create liabilities for all three key players in an election (candidates, voters, and intermediaries) so that no bad actor goes overlooked.

Accordingly, this paper seeks to outline a legal scheme of regulations to secure elections against social media manipulation. Part I explores the different ways social media may manipulate elections. Part II analyses whether regulations for securing elections may restrict the free speech of social media users. Finally, Part III deals with the liabilities the regulations may create and a plan to give those regulations effect.

Part I: Social Media Manipulates Elections

Social media cannot recognise deceit. As a result, it is not safe from those who seek to disrupt elections. They can easily exploit the technological vulnerabilities of social media platforms to misrepresent the truth to voters and prompt them to make choices based on false facts.

Public discourse is rife on social media during elections. Through discourse, different opinions are identified and heard. However, online trolling and cyber-bullying have become common tactics to derail such discussions. Trolling is any act done deliberately to provoke a person by posting offensive messages; cyber-bullying is the online harassment and intimidation of people. Many resort to these tactics when they find a discussion inconvenient. Such tactics ensure that certain opinions prevail on social media while others are deliberately silenced.

People are also susceptible to herd mentality: they tend to adopt the popular opinion as their own, believing its popularity makes it 'a safe bet'. Astroturfing is a ploy on social media that exploits this tendency. It is the act of deceiving people into believing that a particular candidate, party, or opinion enjoys public support when in fact little support exists in its favour, effectively misleading voters.

Astroturfing is carried out through both hired individuals and bots. Their job is to generate high volumes of posts on a topic favourable to their employer, making the topic trend and leading unsuspecting users to believe it genuinely enjoys popular support.

Bots plague social media. They are automated software programs, more persuasive and harder to detect than hired individuals. One study has found that bots drive 10% to 20% of conversations on issues like elections.
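One crude signal platforms can use against such automation is the unnatural regularity of a bot's posting schedule. The sketch below is a minimal, illustrative heuristic only (the thresholds and the idea of flagging on interval regularity are the author's assumptions; real detection systems combine many richer signals):

```python
from statistics import pstdev, mean

def likely_bot(post_times, min_posts=20, cv_threshold=0.2):
    """Flag an account whose posting intervals are suspiciously regular.

    post_times: sorted UNIX timestamps of the account's posts.
    Automated accounts often post at near-constant intervals, so a low
    coefficient of variation (stdev / mean) of the gaps between posts
    is a crude signal of automation. Thresholds here are illustrative.
    """
    if len(post_times) < min_posts:
        return False  # too little data to judge
    gaps = [b - a for a, b in zip(post_times, post_times[1:])]
    avg = mean(gaps)
    if avg == 0:
        return True  # bursts of simultaneous posts
    return pstdev(gaps) / avg < cv_threshold

# A human-like account: irregular gaps between posts
human = [0, 40, 95, 400, 560, 561, 900, 1500, 1510, 2400,
         3100, 3200, 4000, 5500, 5600, 7000, 7050, 9000, 9500, 12000]
# A scripted account: posts every ~60 seconds
bot = [i * 60 for i in range(20)]
print(likely_bot(human), likely_bot(bot))  # irregular account passes, scripted one is flagged
```

A real deployment would weigh content similarity, network structure, and account metadata alongside timing, but even this toy version shows why purely mechanical behaviour is detectable in principle.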

Social media is a fountainhead of misinformation. Voters on social media may therefore make decisions based on falsified information and sabotage their own interests. Studies find that even a brief exposure to fake news can influence people without their realising it. Misinformation may also stir up political or religious unrest, polarising people and clouding their judgement.

Often, a simple overload of information can harm voters. Information overload occurs when excess information confronts voters trying to make a decision. Research shows that it can handicap their decision-making ability, causing them to make a poor decision or none at all.

The 'mere exposure effect' is a psychological concept holding that repeated exposure to something increases the chance that a person will start to like it. On social media, voters are exposed to political ads and other electoral matters such as exit polls and opinion surveys, many of which are unreliable or outright fake. Such unreliable, unwanted information can prejudice voters' minds against their wishes.

Social media platforms use algorithms to personalise content for users, recommending content based on what users regularly prefer. During elections, such algorithms are inimical to informed decision-making: users are exposed only to content that reinforces their political beliefs, leading them to believe those beliefs are widely popular.
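The self-reinforcing dynamic can be made concrete with a deliberately naive ranking sketch: the feed scores each item by how often the user has already engaged with its topic, so the user's dominant leaning crowds out everything else. The item structure and topic labels are hypothetical, and real recommenders are vastly more complex, but the feedback loop is the same in kind:

```python
from collections import Counter

def rank_feed(items, engagement_history):
    """Rank items by how often the user already engaged with each topic.

    This reinforces existing preferences: topics the user has clicked on
    before always outrank unfamiliar ones, producing a 'filter bubble'.
    """
    topic_weight = Counter(engagement_history)
    return sorted(items, key=lambda it: topic_weight[it["topic"]], reverse=True)

items = [
    {"title": "Rally coverage: Party A", "topic": "party_a"},
    {"title": "Policy analysis: Party B", "topic": "party_b"},
    {"title": "Party A manifesto explainer", "topic": "party_a"},
    {"title": "Neutral voter guide", "topic": "neutral"},
]
# A user who has engaged mostly with Party A content before...
history = ["party_a", "party_a", "party_a", "neutral", "party_b"]
feed = rank_feed(items, history)
print([it["topic"] for it in feed])  # Party A content floats to the top
```

Obligation III of the code proposed in Section B (correcting flawed algorithms) is aimed precisely at breaking this loop, for instance by mixing counter-attitudinal or neutral content back into the feed.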

Section 126 of the Representation of the People Act, 1951 (RPA) prohibits active election campaigning from around two days before polling day until polling ends, offering voters a silence period in which to settle on their final decision free from distractions or influences. However, social media gives political parties and candidates enough elbow room to observe the silence period superficially while breaching it in substance: they may refrain from publishing new electoral content but share or like existing electoral content, bringing it to the notice of voters who follow their accounts.

Part II: Freedom of Speech or Integrity of Election

Social media manifestly threatens the sanctity of elections, and regulating its influence on them therefore becomes necessary. If, in that pursuit, a need arises to curtail the rights of citizens, that need must be met. No right of a citizen can take precedence over the urgent need to protect elections. While voters and candidates enjoy fundamental rights, it is essential to remember that no right is absolute, especially when it competes with the public interest.

The freedom of speech and expression that social media users enjoy is not absolute and should be subjected to reasonable restrictions to preserve the sanctity of elections. Without bona fide elections, a democracy is doomed; and without democracy, fundamental rights are unsustainable. Saving elections must therefore be the priority.

Both candidates and voters enjoy freedom of speech and expression on the internet as well; however, as several judgments have reiterated, fundamental rights can be curtailed in favour of the public interest. In Govind v State of Madhya Pradesh & Anr, the hon'ble Supreme Court observed that they can be "restricted on the basis of compelling public interest." Article 19(2) of the Indian Constitution supplies the grounds for imposing reasonable restrictions on freedom of speech and expression. One such ground is the interests of the sovereignty and integrity of India, and the author contends that electoral integrity is pivotal to those interests.

Electoral candidates seek to become representatives in order to serve the nation. Since elections are what make them representatives, they owe an obligation to the whole electoral process and may have to accept certain restrictions on their rights to ensure elections are held fairly. In Election Commission of India v Mukhtar Ansari and Others, the Delhi High Court, while deciding a candidate's right to canvass while in custody for an alleged offence, observed: "No candidate can be permitted to do any act which interferes with the process of a free and fair election."

The same holds true for voters. As citizens, they hold the power to vote; but if elections are sabotaged, that power will amount to nothing. To enjoy rights, one must equally observe duties, and it is a fundamental duty of citizens to safeguard the sovereignty and integrity of India. Fundamental duties complement State obligations and are of constitutional significance. Moreover, freedom of speech and expression becomes self-defeating if it facilitates the vitiation of the very electoral process on which the survival of democracy rides.

Part III: A Scheme to Regulate Social Media

This part draws up a legal scheme to regulate social media during elections. Section A describes the liabilities the scheme may place on voters and candidates; Section B describes intermediary liability; and Section C recommends ways to give these liabilities legal effect.

Section A: The Liability of Voters and Candidates

Elections depend on both voters and candidates; if either fails to respect their sanctity, elections will fail. On social media, where voters and candidates alike are ultimately users and can equally hurt electoral integrity, it is prudent to place liabilities on both.

Foremost, the scope of section 126 of the RPA, 1951, should be extended to social media. The author further recommends that section 126 be amended so that all social media users must observe the silence period. Presently, only political parties are required to observe it; voters are left out. This exemption allows a voter to intentionally or unintentionally share or publish election-related posts during the silence period and effectively breach it. Moreover, if a candidate secretly hires a campaigner, the campaigner, being exempted, may continue to campaign even during the silence period. Hence, section 126 must not treat voters and candidates differently.

Section 127A of the RPA, 1951, prohibits the printing or publishing of election pamphlets and posters that do not carry the name and address of the printer and publisher. The section should be amended to cover social media content. A major obstacle to preventing misinformation on social media is that the offending content usually carries no information about its originator. To ensure accountability, the section should therefore mandate that all election-related content released on social media by a party or candidate be digitally watermarked. Highly sophisticated digital watermarking techniques are now available that make watermarks tamper-resistant and robust. The watermark would clearly exhibit the identification details of the publisher.
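The accountability mechanism can be illustrated with a simplified sketch. A true digital watermark is embedded imperceptibly in the media itself; the code below shows only the underlying idea of binding a publisher's identity to content in a tamper-evident way, using a keyed signature. The key, field names, and publisher identifier are all hypothetical assumptions for illustration:

```python
import hashlib
import hmac

# Hypothetical shared secret issued to the publisher at registration.
ECI_KEY = b"registration-secret-issued-by-eci"

def tag_content(content: bytes, publisher_id: str) -> dict:
    """Produce a tamper-evident tag binding publisher identity to content."""
    sig = hmac.new(ECI_KEY, content + publisher_id.encode(), hashlib.sha256)
    return {"publisher": publisher_id, "signature": sig.hexdigest()}

def verify_tag(content: bytes, tag: dict) -> bool:
    """Recompute the signature; any change to content or identity fails."""
    expected = hmac.new(ECI_KEY, content + tag["publisher"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag["signature"])

ad = b"Vote for Party X on polling day"
tag = tag_content(ad, "party-x/registered-id-042")
print(verify_tag(ad, tag))                   # genuine content verifies
print(verify_tag(b"Doctored message", tag))  # tampering is detected
```

An actual watermark would survive cropping, re-encoding, and screenshots, which a detached signature does not; the sketch conveys only the tamper-evidence and attribution properties the amended section would demand.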

The Model Code of Conduct (MCC) sets forth guidelines for candidates and political parties to observe during elections. The MCC deals with the part of an election that unfolds in the physical world. Hence, another code, similar to the MCC, but which concerns itself with the virtual world, is recommended.

The new code may be called the 'Model Code of Virtual Conduct' (MCVC) and prescribe the following guidelines:

I. Every candidate must create a fresh account on every social media platform used for election campaigning. The candidate must not use a personal account for the duration of the election to post matter that may influence voters. This will help the Election Commission of India (ECI) monitor accounts easily.

II. Once the silence period begins, these election-special accounts must be deleted, so that no user can share their previous posts and thereby breach the silence period.

III. Candidates must not publish any inflammatory or defamatory post on social media, so as to curb misinformation.

IV. Candidates must not commit, on social media platforms, the corrupt practices and electoral offences defined in the RPA, 1951; here the meaning of 'corrupt practices' should be extended to include buying likes, covertly hiring influencers for promotion, or any similar tactic resembling astroturfing.

V. Trolling, bullying, or any form of online harassment by a party, candidate, or their supporters should be banned.

VI. Using influencers' names or photos as endorsements without their explicit consent, and spamming the comment sections of social media posts with election-related matter, should be banned.

VII. Every party should hold online public meetings in cooperation with the ECI, so that the Commission, assisted by the social media platform, can actively track and remove any element of nuisance in real time during the meeting.

VIII. Parties and candidates should regularly advise their supporters, through their election-special accounts, against resorting to illegal electoral activities such as hate speech.

IX. Candidates and parties must disclose to the ECI details of all user accounts they hire for promotion. Only those disclosed accounts may participate in online campaigning.

X. Finally, the ECI must appoint a team of social media observers for every significant social media platform. All candidates and their agents should be directed to notify the team of any issue threatening the fairness of the election on the platform assigned to that team.

Section B: Intermediary Liability

Intermediary liability is the legal responsibility of intermediaries to ensure that no unlawful content is exchanged through them. The concept sounds reasonable: as the facilitators of content exchange, they are indeed in the best position to find unlawful content and prevent its exchange.

However, the author contends that this liability should not be absolute. Strict liability with no scope for immunity may push an intermediary, fearing liability, to censor of its own accord any content it finds even slightly unlawful. Such excessive censorship may gravely threaten the free speech that is indispensable during elections.

In Shreya Singhal v Union of India, the Supreme Court declared that intermediaries are liable only when they fail to act even after receiving an order from a court or from the government or its agencies. Further, the Voluntary Code of Ethics for the 2019 General Election (covering social media regulation during elections) required social media intermediaries to act on orders of the Election Commission within three hours and to endeavour to create awareness of electoral laws and processes.

The judgment and the Code are both welcome. The author recommends that together they form the bedrock of all future provisions on intermediary liability in the context of elections, since they limit the role of intermediaries to acting on behalf of the Commission and do not require them to practise censorship independently.

Further, the author suggests that the goal of intermediary liability during elections should be to push intermediaries to keep securing themselves against any exploitation that may endanger elections. It is in this area that they have more scope to act than the ECI, and they should therefore act proactively. If they do not, they must be held liable.

A code of obligations may be drafted which an intermediary must satisfy for the duration of every election to avoid liability. The Voluntary Code of Ethics should be made a permanent part of this code. In addition, the code may oblige an intermediary to ensure the following:

I. Bot accounts are regularly sought out and removed.

II. Political ads, or ads on any issue relating to an ongoing election, are displayed to a user only after the user consents, shielding users from information overload.

III. It consistently works to correct flaws in its recommendation algorithms.

IV. If it finds an account publishing an excessive number of election-related posts (a threshold to be set by the ECI), it notifies the Commission so that the Commission may keep an eye on the account, since such accounts are more likely than others to commit an electoral offence, intentionally or otherwise.
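Obligation IV is straightforward to operationalise. The sketch below is an illustrative implementation under assumed inputs (the threshold value, the per-post flag marking content as election-related, and the account handles are all hypothetical; the ECI would set the actual threshold):

```python
from collections import Counter

ECI_THRESHOLD = 50  # hypothetical limit set by the ECI

def accounts_to_report(posts, threshold=ECI_THRESHOLD):
    """Return accounts whose election-related post count exceeds the threshold.

    posts: iterable of (account_id, is_election_related) pairs.
    Only election-related posts count toward the threshold.
    """
    counts = Counter(acc for acc, electoral in posts if electoral)
    return sorted(acc for acc, n in counts.items() if n > threshold)

posts = ([("@ordinary_user", True)] * 12
         + [("@bulk_poster", True)] * 80
         + [("@bulk_poster", False)] * 5)
print(accounts_to_report(posts))  # only the high-volume account is flagged
```

Note that flagging merely refers an account to the Commission for observation; it imposes no sanction by itself, which keeps the obligation light-touch and consistent with the free-speech concerns raised in this section.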

Section C: Formalising the Scheme

The easier an offence is to commit, the tougher its deterrent should be, and on social media electoral laws are easily flouted. Hence, it is recommended that the Model Code of Virtual Conduct be made part of the RPA, 1951, and given the force of law. The existing MCC, admittedly, lacks such force; but its scope is comparatively narrow. It governs the physical spaces where elections unfold, and voluntary adherence to it may suffice. The MCVC, in contrast, will tackle social media, which is virtually endless and becoming increasingly opaque. Voluntary adherence to the MCVC will therefore be insufficient.

A concern may arise that if the MCVC is made part of the RPA, 1951, disputes relating to it will be adjudicated by courts, and the whole process of sanctioning an offender will become complicated and drawn-out. To prevent this, the MCVC should be treated differently from the rest of the Act: disputes over its violation may be heard and adjudicated expeditiously by the ECI itself, as having a matter adjudicated only through a court is not a constitutional right.

In Mohinder Singh & Anr v The Chief Election Commissioner, New Delhi & Ors, the Supreme Court held that a hearing need not be an elaborate ritual; in situations of quick dispatch, it may be minimal, even formal.

Moreover, appeal is not an inherent right. Accordingly, the ECI's decision may be made non-appealable to ensure the adjudication is not dragged out further.

The Constitution provides the Election Commission ample powers, and it should be allowed to exercise them; otherwise the very purpose behind the ECI fails. Under Article 324 of the Indian Constitution, the superintendence, direction, and control of the conduct of all elections to Parliament and to the Legislature of every State vest in the Election Commission. The independence of a body depends not solely on factors like salary and term of service but also on whether it must rely on other bodies for its smooth functioning. If the ECI has to depend on the Central Government for better rules and on the judiciary for expeditious adjudication, it is not yet the independent, strong constitutional body the Constitution envisaged.

In Union of India v Association for Democratic Reforms and Anr, the Supreme Court explained the phrase 'conduct of elections' in Article 324 of the Indian Constitution as being "of wide amplitude which would include power to make all necessary provisions for conducting free and fair elections".

In Kanhiyalal Omar v R.K. Trivedi and Others, the Supreme Court was dealing with the constitutional validity of the Election Symbols (Reservation and Allotment) Order, 1968, issued by the Election Commission. The court observed that "any power granted by the Constitution for a specific purpose should be construed liberally so that the object for which the power is granted is effectively achieved," confirming that the Commission had the authority to make the Symbols Order under Article 324.

In light of the above decisions, the author suggests that the ECI may use its constitutional power to impose the code of obligations described in Section B upon social media intermediaries. Without such legal obligations on social media, the Election Commission will not be able to achieve the purpose enshrined in Article 324 of the Constitution.

Conclusion

Social media can aid or destroy elections, and a democratic country like India cannot afford the latter. It is therefore necessary that the effects of social media on elections be controlled through dedicated laws and that electoral integrity be saved before it is beyond salvage. Such laws may indeed restrict free speech, but it must be remembered that without free and fair elections there will be no democracy to guarantee free speech. Intermediaries are in the best position to ensure they are not weaponised against elections, and they must accordingly be obligated by law to keep themselves technologically foolproof.

Cases Referred

Bar Council Of India v Union Of India, (2012) 8 SCC 243

Election Commission Of India v Mukhtar Ansari & Anr, (2017) 238 DLT 571 

Govind v State Of Madhya Pradesh & Anr, (1975) 3 SCR 946

In Re: Ramlila Maidan Incident DT.4/5.06.2011 v Home Secretary, Union of India & Ors, Writ Petition (CRL.) NO. 122 OF 2011

Kanhiyalal Omar v R.K. Trivedi and Others, (1985) 4 SCC 628

Mohinder Singh & Anr v The Chief Election Commissioner, New Delhi & Ors, (1978) 3 SCR 272

Shreya Singhal v Union of India, AIR 2015 SC 1523

Union of India v Association for Democratic Reforms and Anr, (2002) 5 SCC 294