Telegraph e-paper

On watch: Teams tackling disinformation


IT WAS a drizzly January day in 2021 and Britain was still in lockdown when the meeting was brought to order. The Department for Culture, Media and Sport (DCMS) had assembled 21 men and women, with the purpose – the chairman explained – of “addressing the threat posed by Covid-19 mis- and disinformation”.

It was already a “priority” and would remain one as the Covid-19 vaccination programme rolled out across Britain, she explained, “given the clear risk posed by anti-vaccination narratives”.

The vaccination programme had just begun and the Government was relying on it to bring the country out of lockdown.

Those present at the meeting included senior executives working for Google, Facebook and Twitter, as well as the BBC and Ofcom, the broadcasting regulator.

There were also half a dozen academics and representatives of fact-checking organisations and lobby groups such as Full Fact and the Centre for Countering Digital Hate. Together they formed the “Counter-Disinformation Policy Forum”.

Civil servants provided an update on how the national vaccine rollout was being received. There was “further work to do with decreasing vaccine hesitancy amongst black and minority ethnic communities”, and the Government’s last-minute decision to change the interval between doses was “starting to become an area for worries”.

This information is recorded in a memo of the meeting, obtained by the Big Brother Watch campaign group under Freedom of Information laws, and passed to this newspaper.

It is also clear from the disclosure that at least some of those present were wary that their efforts to tackle misinformation should not tip over into censorship.

There should be an “emphasis on importance of freedom of expression”, the document states, and on “transparency”. Delegates were clear that there should be “support and scope for greater efforts towards transparency and publicity”.

Somewhat ironically, about a third of the six-page disclosure is so heavily redacted that it consists of pages of black.

A section marked “Key points” is entirely blacked out. So are the names of the individual attendees, other than three civil servants.

We know that they included Loughborough University’s Andrew Chadwick, and Will Moy, the then chief executive of Full Fact – but only because of statements they have made elsewhere.

At the Policy Forum’s next meeting, in March 2021, ethical principles were no longer the forum’s priority. Another memo says the DCMS proposed prioritising other topics for the remaining Policy Forum sessions.

When shown the heavily redacted text, Mr Moy was aghast. On behalf of Full Fact, he said: “We do not believe that it was necessary or helpful to black out the notes of the meetings in this way.

“We recognise that not all of the work defending against these threats can be public but the Government can and should be more open.”

There weren’t many more meetings of the Policy Forum. The group, which kept a low profile despite its stated commitment to transparency, was wrapped up in June 2021 after a mere six months. But according to parliamentary disclosures, the Government had other measures in place to tackle the problem of disinformation.

Chief among them was the Counter-Disinformation Unit (CDU), a secretive organisation run out of the DCMS, and led by Alex Aiken, the “executive director of government communications”.

The CDU started life in 2019, when its job was to tackle “disinformation” – that is, information that is deliberately misleading – related to the European and general elections.

It was “stood up” for the third time in March 2020 as it became clear that the virus that had spread quickly in Wuhan was threatening to do the same in the UK. Its remit was expanded to include “identifying and responding to harmful misinformation relating to Covid-19” – that is, false information that is “inadvertently spread”, according to Caroline Dinenage, the former digital minister.

It was the first example of mission creep, but others would follow.

What the unit meant by “responding” was varied. In some cases it involved an online rebuttal. But in others, the DCMS used its position with the social media companies as what it called a “trusted flagger” to fast-track a request for content to be removed.

Ms Dinenage told a committee of MPs in 2020 that “where potentially harmful content is identified”, the CDU will “flag that content to the platform to ensure it can be swiftly reviewed and acted on”. She added that the Government does not “mandate the removal of any content”. But it is arguable that the mere fact that the requests come from a ministerial office puts pressure on the social media companies to heed them.

Even Meta’s own “oversight board” – an independent group that reviews moderation decisions on Facebook and Instagram – acknowledges that there is a lack of transparency around government requests for action.

Remarkably little is known about the CDU’s activities beyond its function. It has not revealed how many removal requests it has made. Meanwhile, the DCMS has refused to disclose how many staff the unit has, or how much money is spent on it.

However, emails obtained by Big Brother Watch show that the unit is frequently in touch with the social media giants. A Twitter executive told a special adviser to Matt Hancock, the then health secretary, in March 2020: “We’re also speaking regularly with the DCMS disinformation unit.”

THE COUNTER-DISINFORMATION UNIT
A cross-Whitehall team set up in 2019 to tackle “disinformation” around the European elections. It has been reconvened several times over the last four years, including in 2020 during the Covid-19 pandemic. Its remit was expanded during the pandemic to include responding to “harmful misinformation” relating to Covid-19, alongside its routine work of analysing disinformation.

Leaked WhatsApp messages show that over the months that followed, the health secretary discussed the problem of anti-vaccine misinformation with Sir Nick Clegg, the former deputy prime minister, who was then vice-president of global affairs at Meta, Facebook’s owner. In November 2020, Mr Hancock wrote to Sir Nick in America, mid-way through a “roundtable” meeting that the Government was holding with UK executives from Facebook and other technology companies.

“I’m just on a Zoom about tackling anti-vax with [Culture Secretary] Oliver Dowden – obviously vital,” he wrote. “Your team have been working really well with the department and the advertising ban is great – but we need to have a timeframe for removal of antivax material and how do [sic] to demonetise.”

Sir Nick promised: “I’ll look into this.” A month later he sent Mr Hancock another direct message: “Matt – we’re announcing further changes today (basically we’ll now remove false claims – debunked by public health experts – made about authorised/ licences vaccines).”

In the background, the CDU was working alongside the Cabinet Office’s now defunct Rapid Response Unit, which monitored social media and tracked the way information was being shared online so it could make rebuttals if needed. That unit also had so-called “trusted flagger” powers with tech companies and requested the removal of six posts on social media sites in April 2020. It is not clear which platforms they were, but all the posts disappeared, whether they were removed by the platform or the people who posted them.

The specific details of what was taken down have only been made public in one example. The Government requested urgent attention on a Facebook post purporting to come from a Randox delivery driver dropping off boxes of Covid tests to NHS hospitals. The driver posted a picture of boxes of the test kits and their delivery schedule in an update only visible to his friends. Somehow the Government saw it and told Facebook “we would like this removed urgently”.

In the end, the person who posted it deleted the account before Facebook was required to take action. Although this particular example may have been of little consequence, critics of the Government’s covert monitoring activities are concerned about a bigger issue at stake.

And it is one that looms larger when considering the kinds of content these little-known units are monitoring.

THE COUNTER-DISINFORMATION POLICY FORUM
A group of social media companies, fact-checking organisations and academics, assembled ahead of the Covid-19 vaccine rollout with the aim of combatting mis- and disinformation. Members included Facebook, Google, the BBC Trusted News Initiative, YouTube, Twitter, Ofcom, the Global Disinformation Index, the Center for Countering Digital Hate and a number of university professors.

THE RAPID RESPONSE UNIT
A team within the Cabinet Office that monitored news and online content considered to be misinformation, which was then flagged with the Counter-Disinformation Unit. It was wound up in July 2022. During its pilot in 2018, the service spotted “false narratives” relating to the chemical weapons attacks in Syria and ensured that those using search terms on the air strikes were presented with facts.

Government contracts suggest that much of the CDU’s work is carried out with the help of AI firms, scraping the internet for statements that may count as mis- or disinformation.

The DCMS spent £114,000 with a firm called Disinformation Index at the start of the pandemic and has a contract worth more than £1.2 million with Logically, a firm headquartered in Yorkshire, which claims to use AI to “uncover and address” misinformation and disinformation online. Publicly available contract information suggests that the CDU’s monitoring programme continued until at least April 2023, and that it included helping to “build a comprehensive picture of potentially harmful misinformation and disinformation”.

Comprehensive is an apt word. Logically’s literature says it “ingests material from more than 300,000 media sources and all public posts on major social media platforms”. Documents obtained under data laws paint a disturbing picture of the kinds of material that it has monitored for the Government’s CDU.

In regular reports entitled “Covid-19 Mis/Disinformation Platform Terms of Service”, Logically scooped up posts by respected scientists questioning lockdown or arguing against the mass vaccination of children against Covid-19.

They also logged comments made by Silkie Carlo, the director of Big Brother Watch, on Talk TV at the end of 2021, objecting to vaccine passports and branding the proposals “a vision of checkpoint Britain”.

Other reports received by the CDU logged information about David Davis, the Conservative MP, noting him as “highly critical of the Government, with the majority of comments criticising Imperial College and blaming [redacted] personally for lockdown”.

The disclosure does not link to his specific comment, but it came five days after Mr Davis had co-written a piece criticising modelling by Neil Ferguson, the Imperial College London scientist.

These examples are quite removed from the original aim set out by the Policy Forum on that drizzly January day: to address the “threat posed by Covid-19 mis- and disinformation”.

According to Ms Carlo, there has been huge “mission creep”, and we have arrived at a situation where the Government is effectively policing opinions it disagrees with as “false” information.

She said: “Whilst everyone would expect the Government and tech giants to act against foreign hostile disinformation campaigns, we should be incredibly cautious about these powers being turned inwards to scan, suppress and censor the lawful speech of Brits for wrongthink, as is shockingly the case right now. The very concept of ‘wrong information’ dictated by a central authority is open to abuse and should be considered far more critically, lest we mirror Chinese-style censorship.”

A spokesman for Mr Hancock said the information was in the public domain and directed readers to buy a copy of his book.

A BBC spokesman said the broadcaster attended the Counter-Disinformation Policy Forum in an observer-only capacity.

As an editor, I saw this phenomenon early on. When an article or a video is published, we track how many views it gets. During lockdown, voices of dissent mysteriously came back with very low figures. That is usually a sign of being “shadow banned”: the journalism is deemed (usually by a bot) to be problematic, so it is not promoted. Sometimes a video is “demonetised”, so no advertising appears alongside it. Sometimes, it’s censored outright.

When David Davis criticised lockdown in a House of Commons speech, YouTube took his video down. When we commissioned Oxford’s Carl Heneghan and Tom Jefferson to write about the evidence behind masks, Facebook labelled it “false information”. Why? What aspect of what they wrote was false? Even now, Facebook refuses to explain. It is not regulated, so doesn’t need to. It has all the power, no transparency – and no accountability.

This combination makes Facebook such a tempting target for government. A quiet word here, a tweak of the algorithm there – and ministers can control the news agenda in a way no newspaper would ever allow. Matt Hancock’s WhatsApp messages show that politicians knew how to pressure Facebook: they’d just threaten regulation. And with Nick Clegg now occupying such a powerful position in Facebook, as Zuckerberg’s helper, they had an easy contact.

Lord Bethell, a health minister, messaged Hancock to suggest a line that would focus Clegg’s attention. “People are being confused by what

MOLLY KINGSLEY
Molly Kingsley is a former lawyer and journalist who founded the children’s campaign group UsForThem after the beginning of the pandemic.

2023-06-03T07:00:00.0000000Z

https://dailytelegraph.pressreader.com/article/281629604657831

Daily Telegraph