The online spread of disinformation and fake news threatens the future of democracy in the UK, a parliamentary committee has warned.
In order to clean up the “wild west” world of social media, the government should tighten regulation concerning technology companies’ liability for the sharing of false content on their platforms, the Digital, Culture, Media and Sport Committee recommended in a report published on Sunday.
The report comes ahead of legislation aimed at overhauling online safety and internet use, which the government is expected to introduce later this year.
“In this rapidly changing digital world, our existing legal framework is no longer fit for purpose,” the report said.
“Our democracy is at risk, and now is the time to act, to protect our shared values and the integrity of our democratic institutions.”
‘Not passive platforms’
The report, released after months of investigation into fake news by the committee, also called on the government to begin auditing the security measures and algorithms employed by technology companies.
Algorithms enable platforms to prioritise content flows, meaning information may be given prominence based on its perceived relevance to the user instead of time of publication.
“Tech companies are not passive platforms on which users input content; they reward what is most engaging … They have profited greatly by using this model,” the report said.
“This manipulation of the sites … must be made more transparent,” it added.
The committee has repeatedly asked Facebook CEO Mark Zuckerberg to appear at a hearing to answer outstanding questions concerning the platform [File: Nam Y. Huh/AP]
Tech giant Facebook came under heavy scrutiny in the report, which accused the company of “obfuscating” when asked to answer questions regarding possible interference by foreign governments – including Russia – in UK political campaigns via the platform.
In particular, the committee was interested in determining whether Moscow had funded the placement of political adverts on Facebook during the 2016 EU referendum, which saw the UK vote to leave the 28-member bloc.
Last year, UK Prime Minister Theresa May accused Russia of “planting fake stories” to “sow discord in the West and undermine our institutions”, allegations that Moscow has repeatedly denied.
According to evidence initially supplied to the committee by Facebook, the St Petersburg-based Internet Research Agency (IRA) bought only three adverts worth $0.97 in the days prior to the Brexit vote.
Subsequent internal investigations by the company, prompted by the committee’s scrutiny, found no additional activity tied to Russia.
The committee’s report, however, said Facebook’s investigations failed to include any examination of unpaid posts and, instead, solely focused on the IRA “troll farms”.
Facebook said in a statement following the report’s publication that the committee’s findings had raised “some important issues”, and pledged to work with UK officials to develop new transparency tools.
“We share their goal of ensuring that political advertising is fair and transparent and agree that electoral rule changes are needed,” Richard Allan, vice president of policy, said.
The report, published four months after the Cambridge Analytica data scandal, also examined tech companies’ management of their users’ data.
In March, reports surfaced that the now-defunct consultancy firm had illegally obtained personal information from millions of Facebook users. A month later, British politicians revealed that the pro-Brexit campaign group Leave.EU had benefited from work conducted by the firm.
Sunday’s report concluded the government must do more to guarantee the protection of users’ data, and called on policymakers to consider creating a digital “Atlantic Charter” with the US as a means of establishing cooperation over the legal obligations placed on technology companies.
Full Fact, the UK’s independent fact-checking charity, said following the report’s publication that the government should act immediately to protect democracy, but warned against “overreacting”.
“One of the biggest risks here is government overreaction,” Full Fact said in a tweet. “The cure could be worse than the disease. Action must be taken that both protects free speech and limits the harm from misinformation.”
Last year, the government announced it would introduce new laws to make Britain the safest place in the world to be online. The policy proposals are expected to be published by the end of 2018.