A UK parliamentary committee which has been running a multi-month investigation into the impact of online disinformation on political campaigning — and on democracy itself — has published a preliminary report highlighting what it describes as “significant concerns” over the risks to “shared values and the integrity of our democratic institutions”.
It’s calling for “urgent action” from government and regulatory bodies to “build resilience against misinformation and disinformation into our democratic system”.
“We are faced with a crisis concerning the use of data, the manipulation of our data, and the targeting of pernicious views,” the DCMS committee warns. “In particular, we heard evidence of Russian state-sponsored attempts to influence elections in the US and the UK through social media, of the efforts of private companies to do the same, and of law-breaking by certain Leave campaign groups in the UK’s EU Referendum in their use of social media.”
The inquiry, which was conceived of and begun in the previous UK parliament — before relaunching in fall 2017, after the June General Election — has found itself slap-bang in the middle of one of the major scandals of the modern era, as revelations about the extent of disinformation and social media data misuse, and allegations of election fiddling and law bending, have piled up thick and fast, especially in recent months. (Concerns have been rising steadily ever since the 2016 US presidential election, amid revelations about the cottage industry of fake news purveyors spun up to feed US voters, in addition to Kremlin troll farm activity.)
Yet the Facebook-Cambridge Analytica data misuse saga (which snowballed into a major global scandal this March) is just one of the strands of the committee’s inquiry. Hence they’ve opted to publish multiple reports — the initial one recommending urgent actions for the government and regulators, which will be followed by another report covering the inquiry’s “wider terms of reference” and including a closer look at the role of advertising. (The latter report is slated to land in the fall.)
For now, the committee is suggesting “principle-based recommendations” designed to be “sufficiently adaptive to deal with fast-moving technological developments”.
Among a very long list of recommendations are:
- a levy on social media and tech giants to fund a “major investment” in the UK’s data watchdog, so the body is able to “attract and employ more technically-skilled engineers who not only can analyse current technologies, but have the capacity to predict future technologies” — with the tech company levy operating in “a similar vein to the way in which the banking sector pays for the upkeep of the Financial Conduct Authority”. The committee also wants the government to put forward proposals for an educational levy to be raised on social media companies, “to finance a comprehensive educational framework (developed by charities and non-governmental organisations) and based online”. “Digital literacy should be the fourth pillar of education, alongside reading, writing and maths,” the committee writes. “The DCMS Department should co-ordinate with the Department for Education, in highlighting proposals to include digital literacy, as part of the Physical, Social, Health and Economic curriculum (PSHE). The social media educational levy should be used, in part, by the government, to finance this additional part of the curriculum.” It also wants to see a rolling unified public awareness initiative, part-funded by a tech company levy, to “set the context of social media content, explain to people what their rights over their data are… and set out ways in which people can interact with political campaigning on social media”. “The public should be made more aware of their ability to report digital campaigning that they think is misleading, or unlawful,” it adds.
- amendments to UK Electoral Law to reflect use of new technologies — with the committee backing the Electoral Commission’s suggestion that “all electronic campaigning should have easily accessible digital imprint requirements, including information on the publishing organisation and who is legally responsible for the spending, so that it is obvious at a glance who has sponsored that campaigning material, thereby bringing all online advertisements and messages into line with physically published leaflets, circulars and advertisements”. It also suggests government should “consider the feasibility of clear, persistent banners on all paid-for political adverts and videos, indicating the source and making it easy for users to identify what is in the adverts, and who the advertiser is”. And urges the government to carry out “a comprehensive review of the current rules and regulations surrounding political work during elections and referenda, including: increasing the length of the regulated period; definitions of what constitutes political campaigning; absolute transparency of online political campaigning; a category introduced for digital spending on campaigns; reducing the time for spending returns to be sent to the Electoral Commission (the current time for large political organisations is six months)”.
- the Electoral Commission to establish a code for advertising through social media during election periods “giving consideration to whether such activity should be restricted during the regulated period, to political organisations or campaigns that have registered with the Commission”. It also urges the Commission to propose “more stringent requirements for major donors to demonstrate the source of their donations”, and backs its suggestion of a change in the rules covering political spending so that limits are put on the amount of money an individual can donate
- a major increase in the maximum fine that can be levied by the Electoral Commission (currently just £20,000) — saying the fine should instead be based on a fixed percentage of turnover. It also suggests the body should have the ability to refer matters to the Crown Prosecution Service before its investigations have been completed; and urges the government to consider giving it the power to compel organisations that it does not specifically regulate, including tech companies and individuals, to provide information relevant to its inquiries, subject to due process.
- a public register for political advertising — “requiring all political advertising work to be listed for public display so that, even if work is not requiring regulation, it is accountable, clear, and transparent for all to see”. It also wants the government to conduct a review of UK law to ensure that digital campaigning is defined in a way that includes online adverts that use political terminology but are not sponsored by a specific political party.
- a ban on micro-targeted political advertising to lookalikes online, and a minimum limit for the number of voters sent individual political messages to be agreed at a national level. The committee also suggests the Electoral Commission and the ICO should consider the ethics of Facebook or other relevant social media companies selling lookalike political audiences to advertisers during the regulated period, saying they should consider whether users should have the right to opt out from being included in such lookalike audiences
- a recommendation to formulate a new regulatory category for tech companies that is not necessarily either a platform or a publisher, and which “tightens tech companies’ liabilities”
- a suggestion that the government consider (as part of an existing review of digital advertising) whether the Advertising Standards Authority could regulate digital advertising. “It is our recommendation that this process should establish clear legal liability for the tech companies to act against harmful and illegal content on their platforms,” the committee writes. “This should include both content that has been referred to them for takedown by their users, and other content that should have been easy for the tech companies to identify for themselves. In these cases, failure to act on behalf of the tech companies could leave them open to legal proceedings launched either by a public regulator, and/or by individuals or organisations who have suffered as a result of this content being freely disseminated on a social media platform.”
- another suggestion that the government consider establishing a “digital Atlantic Charter as a new mechanism to reassure users that their digital rights are guaranteed” — with the committee also raising concerns that a privacy loophole risks opening up after the UK leaves the EU, when US-based companies will be able to take UK citizens’ data to the US for processing without the protections afforded by the EU’s GDPR framework (as the UK will then be a third country)
- a suggestion that a professional “global Code of Ethics” should be developed by tech companies, in collaboration with international governments, academics and other “interested parties” (including the World Summit on Information Society), in order to “set down in writing what is and what is not acceptable by users on social media, with possible liabilities for companies and for individuals working for those companies, including those technical engineers involved in creating the software for the companies”. “New products should be tested to ensure that products are fit-for-purpose and do not constitute dangers to the users, or to society,” it suggests. “The Code of Ethics should be the backbone of tech companies’ work, and should be continually referred to when developing new technologies and algorithms. If companies fail to adhere to their own Code of Ethics, the UK Government should introduce regulation to make such ethical rules compulsory.”
- the committee also suggests the government avoid using the (charged and confusing) term ‘fake news’ — and instead put forward an agreed definition of the words ‘misinformation’ and ‘disinformation’. It should also support research into the methods by which misinformation and disinformation are created and spread across the internet, including support for fact-checking. “We recommend that the government initiate a working group of experts to create a credible annotation of standards, so that people can see, at a glance, the level of verification of a site. This would help people to decide on the level of importance that they put on those sites,” it writes
- a suggestion that tech companies should be subject to security and algorithmic auditing — with the committee writing: “Just as the finances of companies are audited and scrutinised, the same type of auditing and scrutinising should be carried out on the non-financial aspects of technology companies, including their security mechanisms and algorithms, to ensure they are operating responsibly. The Government should provide the appropriate body with the power to audit these companies, including algorithmic auditing, and we reiterate the point that the ICO’s powers should be substantially strengthened in these respects”. The committee also floats the idea that the Competition and Markets Authority considers conducting an audit of the operation of the advertising market on social media (given the risk of fake accounts leading to ad fraud)
- a requirement for tech companies to make full disclosure of targeting used as part of advert transparency. The committee says tech companies must also address the issue of shell corporations and “other professional attempts to hide identity in advert purchasing”.
How the government will respond to the committee’s laundry list of recommendations for cleaning up online political advertising remains to be seen. The issue of Kremlin-backed disinformation campaigns was at least raised publicly by the prime minister last year, although Theresa May has been rather quieter on revelations about EU referendum-related data misuse and election law breaches.
While the committee uses the term “tech companies” throughout its report to refer to multiple companies, Facebook specifically comes in for some excoriating criticism, with the committee accusing the company of misleading by omission and actively seeking to impede the progress of the inquiry.
It also reiterates its call — for something like the fifth time at this point — for founder Mark Zuckerberg to give evidence. Facebook has provided several witnesses to the committee, including its CTO, but Zuckerberg has declined its requests he appear, even via video link. (And even though he did find time for a couple of hours in front of the EU parliament back in May.)
The committee writes:
We undertook fifteen exchanges of correspondence with Facebook, and two oral evidence sessions, in an attempt to elicit some of the information that they held, including information regarding users’ data, foreign interference and details of the so-called ‘dark ads’ that had reached Facebook users. Facebook consistently responded to questions by giving the minimal amount of information possible, and routinely failed to offer information relevant to the inquiry, unless it had been expressly asked for. It provided witnesses who have been unwilling or unable to give full answers to the Committee’s questions. This is the reason why the Committee has continued to press for Mark Zuckerberg to appear as a witness as, by his own admission, he is the person who decides what happens at Facebook.
…
Tech companies are not passive platforms on which users input content; they reward what is most engaging, because engagement is part of their business model and their growth strategy. They have profited greatly by using this model. This manipulation of the sites by tech companies must be made more transparent. Facebook has all of the information. Those outside of the company have none of it, unless Facebook chooses to release it. Facebook was reluctant to share information with the Committee, which does not bode well for future transparency. We ask, once more, for Mr Zuckerberg to come to the Committee to answer the many outstanding questions to which Facebook has not responded adequately, to date.
The committee suggests that the UK’s Defamation Act 2013 means Facebook and other social media companies have a duty to publish and to follow transparent rules — arguing that the Act has provisions which state that “if a user is defamed on social media, and the offending individual cannot be identified, the liability rests with the platform”.
“We urge the government to examine the effectiveness of these provisions, and to monitor tech companies to ensure they are complying with court orders in the UK and to provide details of the source of disputed content — including advertisements — to ensure that they are operating in accordance with the law, or any future industry Codes of Ethics or Conduct. Tech companies also have a responsibility to ensure full disclosure of the source of any political advertising they carry,” it adds.
The committee is especially damning of Facebook’s actions in Burma (as indeed many others have also been), condemning the company’s failure to prevent its platform from being used to spread hate and fuel violence against the Rohingya ethnic minority — and citing the UN’s similarly damning assessment.
“Facebook has hampered our efforts to get information about their company throughout this inquiry. It is as if it thinks that the problem will go away if it does not share information…”
Source: https://techcrunch.com/2018/07/30/fake-news-inquiry-calls-for-social-media-levy-to-defend-democracy/