Parliament's Culture, Media and Sport Select Committee said it would investigate the establishment's concerns about the public being supposedly swayed by propaganda and untruths.

The inquiry will examine the sources of fake news, how it is spread and its impact on democracy. Damian Collins, the committee chairman, said the rise of propaganda and fabrications is:

A threat to democracy and undermines confidence in the media in general. Just as major tech companies have accepted they have a social responsibility to combat piracy online and the illegal sharing of content, they also need to help address the spreading of fake news on social media platforms.

Consumers should also be given new tools to help them assess the origin and likely veracity of news stories they read online.

The committee will be investigating these issues as well as looking into the sources of fake news, what motivates people to spread it and how it has been used around elections and other important political debates.

The MPs want to investigate whether the way advertising is bought, sold and placed online has encouraged the growth of fake news. They also want to address the responsibility of search engines and social media to stop spreading it.

New research suggests, however, that online hoaxes and propaganda may have had only a limited impact in the US presidential election. According to a study by two US economists, fake news that favoured Donald Trump was shared 30 million times in the three months before the election, four times more than false stories favouring Hillary Clinton. But the authors said that only half of the people who saw a false story believed it, and even the most widely circulated hoaxes were seen by only a fraction of voters.

A parliamentary committee is trying to get heavy with Facebook and Twitter over the release of details about Russian election interference.

Damian Collins, chair of the Digital, Culture, Media and Sport select committee, which is looking into so-called fake news, has given the companies until 18 January to correct their failure to hand over information he requested about Russian misinformation campaigns on their platforms. He said:

There has to be a way of scrutinising the procedures that companies like Facebook put in place to help them identify known sources of disinformation, particularly when it's politically motivated and coming from another country.

They need to be able to tell us what they can do about it. And what we need to be able to do is say to the companies: we recognise that you are best placed to monitor what is going on on your own site and to get the balance right in taking action against it while also safeguarding the privacy of users.

But what there has to be then is some mechanism of saying: if you fail to do that, if you ignore requests to act, if you fail to police the site effectively and deal with highly problematic content, then there has to be some sort of sanction
against you.

In a letter to Twitter this month, Collins wrote:

The information you have now shared with us is completely inadequate ... It seems odd that so far we have received more information about activities that have taken place on your platform from journalists and academics than from you.

UK parliamentary committee claims that people failing to vote the 'correct' way has nothing to do with politicians' crap policies that don't look after British people, and must be all down to fake news

Parliament's Digital, Culture, Media and Sport (DCMS) Committee has been investigating disinformation and fake news following the Cambridge Analytica data scandal and is claiming that the UK faces a democratic crisis due to the spread of
pernicious views and the manipulation of personal data.

In its first report it will suggest social media companies should face tighter censorship. It also proposes measures to combat election interference.

The report claims that the relentless targeting of hyper-partisan views, which play to people's fears and prejudices in order to influence their voting, is a threat to democracy.

The report was very critical of Facebook, which has been under increased scrutiny following the Cambridge Analytica data scandal.

Facebook has hampered our efforts to get information about their company throughout this inquiry. It is as if it thinks that the problem will go away if it does not share information about the problem, and reacts only when it is pressed, the report said, adding that Facebook provided witnesses who have been unwilling or unable to give full answers to the committee's questions.

The committee suggests:

1. Social media sites should be held responsible for harmful content on their services

Social media companies cannot hide behind the claim of being merely a 'platform', claiming that they are tech companies and have no role themselves in regulating the content of their sites, the committee said.

They continually change what is and is not seen on their sites, based on algorithms and human intervention.

They reward what is most engaging, because engagement is part of their business model and their growth strategy. They have profited greatly by using this model.

The committee suggested a new category of tech company should be created, which was not necessarily a platform or a publisher but something in between.

This should establish clear legal liability for the tech companies to act against harmful and illegal content on their platforms, the report said.

2. The rules on political campaigns should be made fit for the digital age

The committee said electoral law needed to be updated to reflect changes in campaigning techniques.

It suggested:

- creating a public register for political advertising so that anybody can see what messages are being distributed online
- political advertisements should carry a digital imprint stating who was responsible, as is required with printed leaflets and advertisements
- social media sites should be held responsible for interference in elections by malicious actors
- electoral fraud fines should be increased from a maximum of £20,000 to a percentage of an organisation's annual turnover

3. Technology companies should be taxed to fund education and regulation

Increased regulation of social media sites would result in more work for organisations such as the Electoral Commission and Information Commissioner's Office (ICO).

The committee suggested a levy on tech companies should fund the expanded responsibilities of the regulators.

The money should also be spent on educational programmes and a public information campaign, to help people identify disinformation and fake news.

4. Social networks should be audited

The committee warned that fake accounts on sites such as Facebook and Twitter not only damage the user experience but may also defraud advertisers.

It suggested an independent authority such as the Competition and Markets Authority should audit the social networks.

It also said security mechanisms and algorithms used by social networks should be available for audit by a government regulator, to ensure they are operating responsibly.

Those members of parliament are half right at least. Democracy in Britain and the West is at risk today. But contrary to the wild claims in their fake-news report, the real risk does not come from Russian bloggers or shady groups farming Facebook
users' data. The big threat comes from political elitists like the cross-party clique of Remainer MPs who dominate the DCMS committee.

It looks a lot as if these MPs, like authoritarians from Moscow to Malaysia, have been inspired by the strikingly illiberal precedent set by Angela Merkel's social media law. In particular, part of the idea behind sticking social media companies with legal liability is to scare them into going even further in muzzling free speech than the strict letter of the law requires.