DCMS Select Committee report ‘Misinformation in the COVID-19 Infodemic’

The Online Harms White Paper, published in April 2019, proposed a duty of care on tech companies and an independent Online Harms Regulator. The DCMS Select Committee opened an inquiry in March this year calling for evidence to help understand the causes and impact of COVID-19 misinformation and how it can be tackled.

On 16 July, the Committee published a report, ‘Misinformation in the COVID-19 Infodemic’, which sets out a number of recommendations. On 5G specifically, the report cites written evidence from BT stating that between 23 March and 23 April there were 30 separate attempts to sabotage the UK’s digital infrastructure and around 80 attacks across sites operated by all four mobile networks, with 19 occurring near critical infrastructure such as fire, police and ambulance stations. EE personnel and subcontractors faced 70 separate incidents, including threats to kill and vehicles driven directly at staff.

Key recommendations:

  • If the Online Harms Bill is not ready, the Government should publish draft legislation alongside the full consultation response this autumn.
  • The Committee calls for the new independent online harms regulator to be appointed now. It should have the power to go beyond ensuring that tech companies enforce their own policies, and to ensure that those policies are themselves adequate in addressing the harms faced by society. It should be able to impose significant fines for non-compliance and to disrupt the activities of non-compliant businesses, with custodial sentences ultimately available as a sanction where required.
  • Government should bring forward a detailed process for deciding which harms are in scope for legislation. This process must always be evidence-led and subject to democratic oversight, rather than delegated entirely to the regulator. Legislation should also establish differentiated expectations of tech companies for illegal content and for ‘harmful but legal’ content.
  • Alongside developing its voluntary codes of practice on child sexual exploitation and abuse and on terrorist content, Government should urgently work with tech companies, academics, civil society and regulators to develop a voluntary code of practice to protect citizens from the harmful impacts of misinformation and disinformation. A well-developed code of practice for misinformation and disinformation should be world-leading and will prepare the ground for legislation in this area.
  • Government should consider how algorithmic auditing can be done in practice and bring forward detailed proposals in the final consultation response to the White Paper. Tech companies should have robust and transparent reporting systems in place, and must give the public clear and specific information about how reports of content that breaches legislative standards, or a company’s own standards (where these go further than legislation), are dealt with and what the response has been. The new regulator should also regularly test and audit each platform’s user reporting functions.
  • The regulator will need sight of comprehensive advertising libraries to see whether and how advertisers are spreading misinformation through paid advertising, or are exploiting misinformation or other online harms for financial gain. Legislation should also require advertising providers to publish directories of the websites they provide advertising for, allowing greater oversight of the monetisation of online harms by third parties.
  • Features of the digital advertising market controlled by companies such as Facebook and Google must not undermine the ability of newspapers and others to produce quality content. Tech companies should promote trusted and authoritative journalistic sources to combat the spread of misinformation.
  • Government should also consider how regulators can work together to address any gaps in how existing regulation covers online harms, and should do this in consultation with the Digital Regulation Cooperation Forum (the ICO, Ofcom and the CMA).