
House of Commons
Digital, Culture, Media and Sport Committee

Disinformation and ‘fake news’: Interim Report

Fifth Report of Session 2017–19

Report, together with formal minutes relating to the report

Ordered by the House of Commons to be printed 24 July 2018

HC 363
Published on 29 July 2018 by authority of the House of Commons

The Digital, Culture, Media and Sport Committee

The Digital, Culture, Media and Sport Committee is appointed by the House of Commons to examine the expenditure, administration and policy of the Department for Digital, Culture, Media and Sport and its associated public bodies.

Current membership

Damian Collins MP (Conservative, Folkestone and Hythe) (Chair)
Clive Efford MP (Labour, Eltham)
Julie Elliott MP (Labour, Sunderland Central)
Paul Farrelly MP (Labour, Newcastle-under-Lyme)
Simon Hart MP (Conservative, Carmarthen West and South Pembrokeshire)
Julian Knight MP (Conservative, Solihull)
Ian C. Lucas MP (Labour, Wrexham)
Brendan O’Hara MP (Scottish National Party, Argyll and Bute)
Rebecca Pow MP (Conservative, Taunton Deane)
Jo Stevens MP (Labour, Cardiff Central)
Giles Watling MP (Conservative, Clacton)

The following Members were also members of the Committee during the inquiry:
Christian Matheson MP (Labour, City of Chester)

Powers

The Committee is one of the departmental select committees, the powers of which are set out in House of Commons Standing Orders, principally in SO No 152. These are available on the internet via www.parliament.uk.

Publication

Committee reports are published on the Committee’s website at www.parliament.uk/dcmscom and in print by Order of the House.

Evidence relating to this report is published on the inquiry publications page of the Committee’s website.

Committee staff

The current staff of the Committee are Chloe Challender (Clerk), Joe Watt (Second Clerk), Lauren Boyer (Second Clerk), Josephine Willows (Senior Committee Specialist), Lois Jeary (Committee Specialist), Andy Boyd (Senior Committee Assistant), Keely Bishop (Committee Assistant), Lucy Dargahi (Media Officer) and Janet Coull Trisic (Media Officer).

Contacts

All correspondence should be addressed to the Clerk of the Digital, Culture, Media and Sport Committee, House of Commons, London SW1A 0AA. The telephone number for general enquiries is 020 7219 6188; the Committee’s email address is [email protected]

Contents

Summary 3
1 Introduction and background 4
  Definition of ‘fake news’ 7
  How to spot ‘fake news’ 8
  Our recommendations in this Report 9
2 The definition, role and legal responsibilities of tech companies 10
  An unregulated sphere 10
  Regulatory architecture 11
  The Information Commissioner’s Office 11
  The Electoral Commission 13
  Platform or publisher? 16
  Transparency 18
  Bots 19
  Algorithms 20
  Privacy settings and ‘terms and conditions’ 21
  ‘Free Basics’ and Burma 22
  Code of Ethics and developments 23
  Monopolies and the business models of tech companies 24
3 The issue of data targeting, based around the Facebook, GSR and Cambridge Analytica allegations 26
  Cambridge Analytica and micro-targeting 26
  Global Science Research 28
  Facebook 31
  Aggregate IQ (AIQ) 32
  The links between Cambridge Analytica, SCL and AIQ 33
4 Political campaigning 37
  What is a political advert? 37
  Electoral questions concerning the EU Referendum 38
  Co-ordinated campaigns 38
  Leave.EU and data from Eldon Insurance allegedly used for campaigning work 40
5 Russian influence in political campaigns 43
  Introduction 43
  Use of the data obtained by Aleksandr Kogan in Russia 44

  The role of social media companies in disseminating Russian disinformation 45
  Leave.EU, Arron Banks, and Russia 47
  Foreign investment in the EU Referendum 49
  Catalonia Referendum 50
  Co-ordination between UK Departments and between countries 51
6 SCL influence in foreign elections 53
  Introduction 53
  General 53
  St Kitts and Nevis 55
  Trinidad and Tobago 56
  Argentina 56
  Malta 56
  Nigeria and Black Cube 57
  Conclusion 58
7 Digital literacy 60
  The need for digital literacy 60
  Why people connect on social media 60
  Content on social media 61
  Data on social media 61
  A unified approach to digital literacy 62
  Young people 62
  School curriculum 62
Conclusions and recommendations 64
Annex 74
Formal minutes 77
Witnesses 78
Published written evidence 81
List of Reports from the Committee during the current Parliament 87

Summary

There are many potential threats to our democracy and our values. One such threat arises from what has been coined ‘fake news’, created for profit or other gain, disseminated through state-sponsored programmes, or spread through the deliberate distortion of facts, by groups with a particular agenda, including the desire to affect political elections.

Such has been the impact of this agenda that the focus of our inquiry moved from understanding the phenomenon of ‘fake news’, distributed largely through social media, to issues concerning the very future of democracy. Arguably, more invasive than obviously false information is the relentless targeting of hyper-partisan views, which play to the fears and prejudices of people, in order to influence their voting plans and their behaviour. We are faced with a crisis concerning the use of data, the manipulation of our data, and the targeting of pernicious views. In particular, we heard evidence of Russian state-sponsored attempts to influence elections in the US and the UK through social media, of the efforts of private companies to do the same, and of law-breaking by certain Leave campaign groups in the UK’s EU Referendum in their use of social media.

In this rapidly changing digital world, our existing legal framework is no longer fit for purpose. This is very much an interim Report, following an extensive inquiry. A further, substantive Report will follow in the autumn of 2018. We have highlighted significant concerns, following recent revelations regarding, in particular, political manipulation, and we set out areas where urgent action needs to be taken by the Government and other regulatory agencies to build resilience against misinformation and disinformation into our democratic system. Our democracy is at risk, and now is the time to act, to protect our shared values and the integrity of our democratic institutions.

1 Introduction and background

1. In this inquiry, we have studied the spread of false, misleading, and persuasive content, and the ways in which malign players, whether automated or human, or both together, distort what is true in order to create influence, to intimidate, to make money, or to influence political elections.

2. People are increasingly finding out about what is happening in this country, in their local communities, and across the wider world, through social media, rather than through more traditional forms of communication, such as television, print media, or the radio.1 Social media has become hugely influential in our lives.2 Research by the Reuters Institute for the Study of Journalism has shown that not only are huge numbers of people accessing news and information worldwide through Facebook, in particular, but also through social messaging software such as WhatsApp. When such media are used to spread rumours and ‘fake news’, the consequences can be devastating.3

3. Tristan Harris, Co-founder and Executive Director at the Center for Humane Technology—an organisation seeking to realign technology with the best interests of its users—told us about the many billions of people who interact with social media: “There are more than 2 billion people who use Facebook, which is about the number of conventional followers of Christianity. There are about 1.8 billion users of YouTube, which is about the number of conventional followers of Islam. People check their phones about 150 times a day in the developed world.”4 This equates to once every 6.4 minutes in a 16-hour day. This is a profound change in the way in which we access information and news, one which has occurred without conscious appreciation by most of us.

4.
This kind of evidence led us to explore the use of data analytics and psychological profiling to target people on social media with political content, as its political impact has been profound, but largely unappreciated. The inquiry was launched in January 2017 in the previous Parliament, and then relaunched in the autumn, following the June 2017 election. The inquiry’s Terms of Reference were as follows:

• What is ‘fake news’? Where does biased but legitimate commentary shade into propaganda and lies?
• What impact does fake news have on public understanding of the world, and also on the public response to traditional journalism? If all views are equally valid, does objectivity and balance lose all value?
• Is there any difference in the way people of different ages, social backgrounds, genders etc use and respond to fake news?
• Have changes in the selling and placing of advertising encouraged the growth of fake news, for example by making it profitable to use fake news to attract more hits to websites, and thus more income from advertisers?5

1 News consumption in the UK: 2016, Ofcom, 29 June 2017
2 Tristan Harris, Co-founder and Executive Director, Center for Humane Technology, Q3147
3 The seventh annual Digital News Report, by the Reuters Institute for the Study of Journalism, University of Oxford, was based on a YouGov online survey of 74,000 people in 37 countries.
4 Tristan Harris, Q3147
5 Terms of reference, Fake News inquiry, DCMS Committee, 15 September 2017.

5. We will address the wider areas of our Terms of Reference, including the role of advertising, in our further Report this autumn. In recent months, however, our inquiry delved increasingly into the political use of social media, raising concerns that we wish to address immediately. We had asked representatives from Facebook, in February 2018, about Facebook developers and data harvesting.6 Then, in March 2018, Carole Cadwalladr of The Observer,7 together with Channel 4 News and the New York Times, published allegations about Cambridge Analytica (and associated companies) and its work with Global Science Research (GSR), and the misuse of Facebook data.8 Those allegations put into question the use of data during the EU Referendum in 2016, and the extent of foreign interference in UK politics. Our oral evidence sessions subsequently focussed on those specific revelations, and we invited several people involved to give evidence. The allegations highlighted both the amount of data that private companies and organisations hold on individuals, and the ability of technology to manipulate people.

6. This transatlantic media coverage brought our Committee into close contact with other parliaments around the world. The US Senate Select Committee on Intelligence, the US House of Representatives Permanent Select Committee on Intelligence, the European Parliament, and the Canadian Standing Committee on Access to Information, Privacy, and Ethics all carried out independent investigations. We shared information, sometimes live, during the hearings. Representatives from other countries, including Spain, France, Estonia, Latvia, Lithuania, Australia, Singapore, Canada, and Uzbekistan, have visited London, and we have shared our evidence and thoughts.
We were also told about the work of SCL Elections—and other SCL associates, including Cambridge Analytica—set up by the businessman Alexander Nix; their role in manipulating foreign elections; and the financial benefits they gained through those activities. What became clear is that, without the knowledge of most politicians and election regulators across the world, not to mention the wider public, a small group of individuals and businesses had been influencing elections across different jurisdictions in recent years.

7. We invited many witnesses to give evidence. Some came to the Committee willingly, others less so. We were forced to summon two witnesses: Alexander Nix, former CEO of Cambridge Analytica; and Dominic Cummings, Campaign Director of Vote Leave, the designated Leave campaign group in the EU Referendum. While Mr Nix subsequently agreed to appear before the Committee, Dominic Cummings still refused. We were then compelled to ask the House to support a motion ordering Mr Cummings to appear before the Committee.9 At the time of writing he has still not complied with this Order, and the matter has been referred by the House to the Committee of Privileges. Mr Cummings’ contemptuous behaviour is unprecedented in the history of this Committee’s inquiries and underlines concerns about the difficulties of enforcing co-operation with Parliamentary scrutiny in the modern age.
We will return to this issue in our Report in the autumn, and believe it to be an urgent matter for consideration by the Privileges Committee and by Parliament as a whole.

6 Monika Bickert, Q389
7 In June 2018, Carole Cadwalladr won the Orwell journalism prize for her investigative work into Cambridge Analytica, which culminated in a series of articles from March 2018.
8 Harry Davies had previously published the article ‘Ted Cruz using firm that harvested data on millions of unwitting Facebook users’ in The Guardian, on 11 December 2015, which first revealed the harvesting of data from Facebook.
9 Following the motion being passed, Dominic Cummings did not appear before the Committee. The matter was then referred to the Privileges Committee on 28 June 2018.

8. In total, we held twenty oral evidence sessions, including two informal background sessions, and heard from 61 witnesses, asking over 3,500 questions at these hearings. We received over 150 written submissions, numerous pieces of background evidence, and undertook substantial exchanges of correspondence with organisations and individuals. We held one oral evidence session in Washington D.C. (the first time a Select Committee has held a public, live broadcast oral evidence session abroad) and also heard from experts in the tech field, journalists and politicians, in private meetings, in Washington and New York. Most of our witnesses took the Select Committee process seriously, and gave considered, thoughtful evidence, specific to the context of our inquiry. We thank witnesses, experts, politicians, and individuals (including whistle-blowers) whom we met in public and in private, in this country and abroad, and who have been generous with their expertise, knowledge, help and ideas.10 We also thank Dr Lara Brown and her team at the Graduate School of Political Management at George Washington University, for hosting the Select Committee’s oral evidence session in the US.

9. As noted above, this is our first Report on misinformation and disinformation. Another Report will be published in the autumn of 2018, which will include more substantive recommendations, and also detailed analysis of data obtained from the insecure AggregateIQ website, harvested and submitted to us by Chris Vickery, Director of Cyber Risk Research at UpGuard.11 Aggregate IQ is one of the businesses involved most closely in influencing elections.

10.
Since we commenced this inquiry, the Electoral Commission has reported on serious breaches by Vote Leave and other campaign groups during the 2016 EU Referendum; the Information Commissioner’s Office has found serious data breaches by Facebook and Cambridge Analytica, amongst others; the Department for Digital, Culture, Media and Sport (DCMS) has launched the Cairncross Review into press sustainability in the digital age; and, following a Green Paper in May 2018, the Government has announced its intention to publish a White Paper later this year into making the internet and social media safer. This interim Report, therefore, focuses at this stage on seven of the areas covered in our inquiry:

• Definition of fake news, and how to spot it;
• Definition, role and legal liabilities of social media platforms;
• Data misuse and targeting, focussing on the Facebook/Cambridge Analytica/AIQ revelations;
• Political campaigning;
• Foreign players in UK elections and referenda;
• Co-ordination of Departments within Government;
• Digital literacy.

10 Our expert adviser for the inquiry was Dr Charles Kriel, Associate Fellow at the King’s Centre for Strategic Communications (KCSC), King’s College London. His declared interests are: Director, Kriel.Agency, a digital media and social data consulting agency; Countering Violent Extremism Programme Director, Corsham Institute, a civil society charity; and Co-founder and shareholder, Lightful, a social media tool for charities.
11 In the early autumn, we hope to invite Ofcom and the Advertising Standards Authority to give evidence, and to re-invite witnesses from the Information Commissioner’s Office and the Electoral Commission; this oral evidence will also inform our substantive Report.

Definition of ‘fake news’

11. There is no agreed definition of the term ‘fake news’, which became widely used in 2016 (although it first appeared in the US in the latter part of the 19th century).12 Claire Wardle, from First Draft, told us in our oral evidence session in Washington D.C. that “when we are talking about this huge spectrum, we cannot start thinking about regulation, and we cannot start talking about interventions, if we are not clear about what we mean”.13 It has been used by some, notably the current US President Donald Trump, to describe content published by established news providers that they dislike or disagree with, but is more widely applied to various types of false information, including:

• Fabricated content: completely false content;
• Manipulated content: distortion of genuine information or imagery, for example a headline that is made more sensationalist, often popularised by ‘clickbait’;
• Imposter content: impersonation of genuine sources, for example by using the branding of an established news agency;
• Misleading content: misleading use of information, for example by presenting comment as fact;
• False context of connection: factually accurate content that is shared with false contextual information, for example when a headline of an article does not reflect the content;
• Satire and parody: presenting humorous but false stories as if they are true. Although not usually categorised as fake news, this may unintentionally fool readers.14

12.
In addition to the above is the relentless prevalence of ‘micro-targeted messaging’, which may distort people’s views and opinions.15 The distortion of images is a related problem; evidence from MoneySavingExpert.com cited celebrities who have had their images used to endorse scam money-making businesses, including Martin Lewis, whose face has been used in adverts across Facebook and the internet for scams endorsing products including binary trading and energy products.16 There are also ‘deepfakes’: audio and videos that look and sound like a real person, saying something that that person has never said.17 These examples will only become more complex and harder to spot, the more sophisticated the software becomes.

12 Fake News: A Roadmap, NATO Strategic Centre for Strategic Communications, Riga, and King’s Centre for Strategic Communications (KCSC), January 2018.
13 Claire Wardle, Q573
14 Online information and fake news, Parliamentary Office of Science and Technology, July 2017, box 4. Also see First Draft News, Fake news. It’s complicated, February 2017; Ben Nimmo (FNW0125); Full Fact (FNW0097)
15 Micro-targeting of messages will be explored in greater detail in Chapter 4.
16 MoneySavingExpert.com (FKN0068)
17 Edward Lucas, Q881

13. There is no regulatory body that oversees social media platforms and written content, including printed news content, online, as a whole. However, in the UK, under the Communications Act 2003, Ofcom sets and enforces content standards for television

and radio broadcasters, including rules relating to accuracy and impartiality.18 On 13 July 2018