There must be a better way of divining voter intentions. Could it be that people who are unwilling to tell relative strangers from polling organisations their real intentions might express their positions more honestly while debating an issue with their online communities on social media?

The EU project SENSEI, in which our Language and Computation lab in the School of Computer Science and Electrical Engineering (CSEE) participates, has set up the site SENSE-EU to test a new approach to monitoring voter intentions through the views they express on social media. The SENSE-EU site, which is monitoring views on the EU referendum in the UK, will test whether this form of monitoring yields more accurate predictions of voting intentions than traditional polls.

Tuning in

The objective of the SENSEI project is to develop language technology tools that can summarise conversations – both traditional spoken conversations and the new forms of conversation proliferating online, for example in forums and in the below-the-line comments on newspaper articles.

The project aims to do this by identifying the structure of such conversations – in particular, whether participants agree or disagree with each other and/or with each other's opinions. In collaboration with The Guardian and The Independent, we have developed software that can automatically report, among other things, which opinions are most debated – that is, most agreed or disagreed with – and which readers agree with which opinion.

The SENSE-EU site uses this software to produce an automatic assessment of voters' intentions. Automatic monitoring of voter intentions using language technology has become a widespread form of polling; examples include TheySay's tracking of views on the EU referendum on Twitter, and the monitoring of Twitter opinions carried out for European Futures by Claire Llewellyn and Laura Cram, academics at the University of Edinburgh. These systems rely mostly on opinion mining – that is, detecting whether posters express a positive or negative sentiment in statements about EU referendum-related topics such as immigration.

For instance, a poster may tweet: “This country is being swamped with immigrants”; this tweet would be recognised as being about immigration and as expressing a negative sentiment. By contrast, SENSE-EU tries to recognise whether posters agree or disagree with statements made by others about these topics – and uses the counts of agreements and disagreements to assess voters' views.
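To make the opinion-mining idea concrete, here is a deliberately simple keyword-based sketch of topic and sentiment classification. The word lists and the `classify` function are illustrative inventions for this article, not the actual TheySay or SENSEI pipelines, which use far more sophisticated statistical models:

```python
# Toy sketch of opinion mining: detect a tweet's topic and sentiment
# by matching against small, hand-made keyword lists. Real systems
# use trained statistical models; these lists are purely illustrative.

TOPIC_KEYWORDS = {"immigration": {"immigrant", "immigrants", "migration", "borders"}}
NEGATIVE_WORDS = {"swamped", "crisis", "burden", "problem"}
POSITIVE_WORDS = {"benefit", "welcome", "enrich", "contribute"}

def classify(tweet: str):
    # Crude tokenisation: lowercase and split on whitespace/punctuation.
    words = set(tweet.lower().replace(",", " ").replace(".", " ").split())
    # Topic: first topic whose keyword list overlaps the tweet's words.
    topic = next((t for t, kws in TOPIC_KEYWORDS.items() if words & kws), None)
    # Sentiment: positive-word hits minus negative-word hits.
    score = len(words & POSITIVE_WORDS) - len(words & NEGATIVE_WORDS)
    sentiment = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    return topic, sentiment

print(classify("This country is being swamped with immigrants"))
# → ('immigration', 'negative')
```

A real opinion-mining system must of course cope with negation, sarcasm and context, which is exactly what makes sentiment-based polling noisy.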

In the example above, SENSE-EU would count how many replies to the original tweet agree with it and how many disagree. This assessment of tweets on the basis of agreements and disagreements with them is similar to the validation mechanism used for quality control in crowdsourcing.
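The counting step itself is straightforward once agreement has been detected. The sketch below assumes replies have already been labelled "agree" or "disagree" with the original post – the labels here are hand-made placeholders, whereas in SENSEI that agree/disagree detection is the hard part, done automatically by the language-technology tools:

```python
from collections import Counter

# Toy sketch: tally replies already labelled as agreeing or disagreeing
# with an original tweet, and turn the tallies into a percentage split.
# The reply labels are invented for illustration.

replies = ["agree", "disagree", "agree", "agree", "disagree"]

counts = Counter(replies)
total = sum(counts.values())
agree_pct = 100 * counts["agree"] / total

print(f"agree: {agree_pct:.0f}%, disagree: {100 - agree_pct:.0f}%")
# → agree: 60%, disagree: 40%
```

Aggregating such splits over many tweets on a topic gives an overall picture of which side of the debate is attracting more assent.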

Keeping track of the numbers

It is interesting to note that the results obtained using SENSEI technology differ substantially from those obtained using sentiment analysis and both of these differ from those reported by poll trackers such as those of the Financial Times or The Daily Telegraph.

As I write this, both the Financial Times and the Daily Telegraph poll trackers indicate effective parity between the Remain and Leave camps, with a slight bias towards Remain. The Daily Telegraph has Remain on 51% and Leave on 49%; the Financial Times, meanwhile, has 45% opting for Remain and 43% for Leave.

The automatic trackers produce much clearer predictions. The TheySay sentiment tracker has for months indicated an overwhelming majority for Brexit (as of June 7: 70% Leave, 30% Remain). By contrast, until last week the SENSE-EU site had displayed a bias in favour of Remain. This bias, however, reversed exactly over the weekend of June 5: as of today, the site reports a 53% to 47% preference for Leave, as the graphic below shows.

Social media vs polls. CSEE, University of Essex, Author provided

It will be interesting to see which of the competing approaches to polling is a more accurate predictor of the referendum result – in particular, whether polling based on views expressed on social media is more or less accurate than polling based on views expressed to pollsters. Clearly, people are not necessarily entirely truthful in the picture of their views they present on social media either. But intuition suggests that they might be less cagey there than when asked questions by professional pollsters.