The madness, the anger, the bitter campaigns, the personal attacks, the unsavoury use of history, the sycophancy and the defiance, the arrogance and the posturing of politicians, the “Khan market gang” jibe, the way money influences election outcomes, the dynasty in Indian politics, the divisiveness of the saffron surge, the chasm between voter aspirations and poll promises, the fringe and its rabid intolerance—do all these make politics less watchable? Given that on May 23, millions of Indians will settle down in front of their television sets, plonk themselves at their office desks with computers tuned into live broadcasts, or sink deep into their mobiles for a long and thrilling day wherever they are, this doesn’t seem to be the case.
As the votes are counted, viewers will receive not just the final tally. An excess of analysis and commentary about voting patterns and the state of our democracy will be on offer too. But before that happens, no matter where we get the final tally of winners from, much of the preceding analysis and prediction has come from one source: exit polls, the exercise of interviewing voters just after they have voted.
Exit polls 2019
By this time, we all know what the overwhelming majority of exit polls have predicted. As I sank into the couch last evening, the television screen flashed the poll predictions. While the numbers for each political party in the fray varied from poll to poll, the broader trend was resoundingly clear: the Narendra Modi-led National Democratic Alliance government will likely return to power. A casual analysis of the numbers suggests a Bharatiya Janata Party win over the Congress in the states of Gujarat, Rajasthan, Chhattisgarh and Madhya Pradesh. Not just this: in a first, the BJP has emerged as a big challenger to regional parties in West Bengal and Odisha, eating into the Left and Congress bastions. The Samajwadi Party-Bahujan Samaj Party alliance seems to be on the rise too, a far cry from the 2014 and 2017 elections in Uttar Pradesh.
Until May 23 when the counting of votes gets underway, we don’t have a way to verify these predictions. But as it happens with exit polls every election, we do have questions. Predominantly: are exit polls always correct?
The answer may lie somewhere in the way the TV channels presented the polls on Sunday and what they predicted. On one end of the spectrum were channels like Republic TV, where guest speakers drove their cars right up to the studio, and CNN-News18, which had its anchor perched on a CG-powered helicopter taking an aerial view of the parliamentary constituencies. All this drama wasn’t for nothing.
This over-hyped presentation of news shows is, at its core, a promotional activity commissioned by news channels at the time of poll broadcasts. While not all of them on Sunday evening were this dramatic, almost every channel screamed about how accurate, credible and intelligent its predictions were. A few of them, aware of the credibility crisis exit polls have faced in recent years, even declared that they were the only ones to get the numbers right. Yet none of them answered the questions critical to the accuracy of the numbers being predicted: details on methodology, sampling, the questions asked, how vote share was calculated, and more.
Vehement declarations on the accuracy of exit polls floated across TV screens irrespective of the channel, placing far greater emphasis on the tally on display than on the relentless ground reportage of the preceding weeks. At this moment, brilliant pieces of journalism and intelligent analyses by experienced editors and reporters faded, and psephologists took over. Along the way, ABP-Nielsen, which had earlier declared 267 seats for the NDA, changed its tally to 277, explaining that the revision reflected polling up to 2 pm on Sunday, raising eyebrows. Blame it on the demand-driven model of exit polls as well: TV channels and the Internet have added to the impatience and brief attention span of viewers, who now obsess over the numbers more than the process and the stories that went into conducting and covering the world’s largest democratic exercise.
However, once the numbers are placed on the board, the divergence between prediction tallies is itself indicative of the possible errors. Sunday’s exit polls diverged widely: between the ABP-Nielsen and CNN-News18-IPSOS numbers for the NDA, there was a difference of 59 seats. Likewise for the UPA, there was a difference of 50 seats between the lowest number (CNN-News18-IPSOS) and the highest (Times Now-VMR) on the tally. The NDA tallies from India Today’s Chanakya and Nielsen, the agencies that got their predictions right in 2014 and 2009 respectively, diverged by 73 seats.
Even state-wise predictions varied: there were huge differences in the predictions for Uttar Pradesh, Odisha and West Bengal made by Nielsen, C-Voter, Chanakya and Jan ki Baat. The vote share predictions varied too, but the gap between the NDA and the UPA stayed consistently large. They also suggested that the UPA could have gathered more votes than in the last elections in 2014.
Yet the methodology used for these polls wasn’t clear beyond the sample size quoted by a few channels. While CNN-News18-IPSOS and India Today put out details online on how their exit polls were conducted, other channels, in a bid to present the numbers differently, focussed on different aspects of the polls to declare results. India Today began with the states in the South, moving on to the West, North and East; Republic went with the total tally first before analysing state-wise predictions; NDTV, which ran a “poll of polls” indicating the average of all the exit poll numbers, had more people on the screen than space given to its poll.
The real challenge for exit pollsters, though, lies in converting vote share into the number of seats won, and this is where the math often goes wrong. Add to this the challenge of an adequately representative sample. There are also important factors like voter turnout, vote share, proper sampling, adequate sample size, the rural-urban divide and the choice of booths. Any miscalculation or misestimation in any of these shows up as poor results, which is why transparency in the methodology used for these exit polls becomes very important. While the India Today exit poll claimed a sample size of seven lakh as against five lakh for C-Voter-Republic, the demographic and caste composition that would give a sense of how representative these samples were wasn’t clear.
Given the past record of exit polls, the silence around the methodology adopted to calculate vote shares, and to convert vote shares into seat shares, has existed for a long time. In this piece, veteran journalist Rajdeep Sardesai says methodology has always been a grey area in the context of exit polls, and that the opacity and lack of curiosity around it have compromised the media’s understanding of elections over the years. Transparency from polling agencies about methodological details could enable sharper and more scientific exit polls.
Yet, because this is missing today, replaced by psephologists and election experts on media panels, exit polls are more like an astrologer’s day at the science fair: with their share of hits and misses, but never failing to amuse.
Not surprisingly, given the lack of clarity on methodology, exit polls have often turned out wrong in the past, and not just in India. The reasons for the inaccuracy of exit polls could also include voters lying about who they actually voted for, the absence of random and representative sampling, biases in questioning or simply bad fieldwork.
Exit polls in 2014 had predicted that the NDA would emerge victorious. But none of them, with the sole exception of India Today’s Chanakya, accurately predicted the extent of the NDA’s victory. The NDA won 336 of the 543 seats and the BJP secured 282 seats on its own. The UPA, on the other hand, won merely 60, and the Congress was reduced to its lowest-ever tally of 44. The News24-Chanakya exit poll had predicted a landslide victory for the NDA with 340 seats, with the UPA reduced to just 70 seats.
Slightly more in the mid-range was the Times Now-ORG exit poll, which gave the NDA 249 seats, 23 short of the halfway mark of 272. This poll also gave the UPA 148 seats, more than the 60 it actually won and the 119 it got in 1999. The CNN-IBN-CSDS and Headlines Today-Cicero polls predicted between 261 and 283 seats for the NDA and between 92 and 120 seats for the UPA. The India TV-C-Voter and ABP-Nielsen polls both predicted an NDA victory in the range of 281 to 289 seats.
As we know now, none of these numbers matched.
On Sunday, as exit polls were relayed on television channels here, comparisons were also drawn with the recently concluded elections in Australia, where the conservative coalition emerged as the surprise winner even though more than 50 pre-poll surveys had predicted its ouster from power. The usual caveat, though, is that in Australia the surprise was with respect to opinion polls, not exit polls, which are a different exercise; other factors, such as the waning popularity of the Labor leader, also played a part.
If there is such a margin of error, why are exit polls conducted at all? What is their purpose? Do they merely help the media “call” elections a few days earlier than the official results? This process of calling elections, and the race among media organisations to be the first to do so, invariably serves a recreational purpose when telecast on TV with all the dramatic statements and high-decibel debates. But does it in any way contribute to the democratic process?
Possibly, because well-executed and well-analysed exit polls allow correct predictions. A second purpose exit polls might serve is to provide valuable information about the electorate. While for journalists and the public the importance of exit polls may end with the noise over seat projections and the maddening race among channels to get the numbers out first and accurately, psephologists and sociologists could draw valuable lessons from them.
What we should be asking of exit pollsters
Besides methodology and representative sampling, exit polls have other issues to address too. Research into exit polls has revealed that the form, wording and order of questions can significantly affect poll results. Whatever the technique may be, it is important to understand how a poll was conducted and what was asked. Do we know this about the exit polls aired on Sunday? No.
Bias in exit poll surveys often occurs when respondents give socially acceptable answers rather than their true opinions on controversial issues. In the Indian context, voters have had issues such as Sabarimala, #MeToo, triple talaq and demonetisation to weigh while voting, but the exit polls are silent on whether these factors were taken into account in the surveys. Channels such as NDTV and Tiranga TV reported exit polls on the basis of a “poll of polls,” where multiple polls are averaged together. There are methodological arguments over how to do this accurately, and some statisticians object to such mixing of polls.
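The mechanics of a "poll of polls" can be sketched in a few lines, again with hypothetical projections and sample sizes rather than the real Sunday numbers. The simplest version is an unweighted mean of each pollster's seat projection; a sample-size-weighted mean is one common refinement. Neither, notably, accounts for differences in methodology between the polls being mixed, which is part of what statisticians object to:

```python
# Minimal "poll of polls" sketch (hypothetical numbers, not actual
# 2019 exit poll figures). Averaging assumes each poll's error is
# independent and unbiased, which mixing polls with different,
# undisclosed methodologies does not guarantee.

def unweighted_mean(projections):
    """Plain average of the pollsters' seat projections."""
    return sum(projections) / len(projections)

def weighted_mean(projections, sample_sizes):
    """Average weighted by each poll's claimed sample size."""
    total = sum(sample_sizes)
    return sum(p * n for p, n in zip(projections, sample_sizes)) / total

# Hypothetical NDA seat projections from four pollsters and their
# claimed sample sizes.
projections = [287, 306, 277, 336]
samples = [500_000, 700_000, 400_000, 300_000]

print(unweighted_mean(projections))          # plain average of the four
print(weighted_mean(projections, samples))   # larger samples count for more
```

The two averages differ by a couple of seats even in this toy case; with real polls, the choice of weighting scheme (and whether to weight by sample size, past accuracy, or something else) can move the headline number meaningfully, which is why a "poll of polls" is itself a methodological choice, not a neutral summary.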
Everyone makes mistakes, and so do polling agencies. Media coverage of exit polls could begin with the media asking pollsters questions on methodology, sampling and other details. Errors can be diagnosed and corrections made only when there is ample information about the process. Pollsters and their partner media channels need to come together and discuss ways to overcome the credibility crisis they face. Besides, accountability for wrong forecasts should be built into the partnerships pollsters enter into with the media.
Until this happens, as they say, the only poll that will continue to matter will be the one on counting day!
This Op-ed piece first appeared at Newslaundry.