Facebook and Google should audit algorithms that boost fake news, say UK Lords

The global coronavirus pandemic has left governments, the tech industry and citizens reeling, not just from the devastating effects of the virus, but from the slew of misinformation that has accompanied it. How best to tackle the spread of false information is a subject of global debate, especially with regard to just how much responsibility the tech platforms hosting it bear.

In the UK, the House of Lords Democracy and Digital Technologies Committee published a report on Monday featuring 45 recommendations for the UK government to take action against the "pandemic of misinformation" and disinformation. Failing to take the threat seriously would undermine democracy, causing it to "decline into irrelevance," it said. 

During the outbreak, the threat of misinformation and disinformation has taken on a new urgency as conspiracy theories have flourished on online platforms. The worst of these have put people's health directly at risk by falsely endorsing dangerous cures or discouraging people from taking precautions against the virus. Across Europe, they've also resulted in damage to telecoms infrastructure when COVID-19 was wrongly linked to 5G.

The report examines the ways false information spread during the virus outbreak and warns that misinformation is a crisis "with roots that extend far deeper, and are likely to last far longer than COVID-19." 

"We are living through a time in which trust is collapsing," said David Puttnam, the committee chair in a statement. "People no longer have faith that they can rely on the information they receive or believe what they are told. That is absolutely corrosive for democracy."

Key among the recommendations are calls to hold the big platforms, specifically Google and Facebook, accountable for the "black box" algorithms that control what content is shown to users. The companies' denial that the decisions they make in shaping and training those algorithms result in harm is "plain wrong," the report says. 

Companies should be mandated to audit their algorithms and to show what steps they take to prevent those algorithms from discriminating, the report said. It also suggested increased transparency from digital platforms about content decisions, so that people have a clear idea of the rules of online debate. 

Regulation: The Online Harms Bill

One of the report's primary recommendations is for the UK government to immediately publish its draft Online Harms Bill. The bill would regulate digital platforms like Google and Facebook, holding them accountable for harmful content and penalizing them when they fail to meet their obligations.

Progress on the bill has been slow: a white paper was published in April 2019, the government's initial response followed in February this year, and the full response, which was supposed to be published over the summer, has been delayed until the end of the year.

The government wasn't able to confirm to the committee whether it would bring a draft bill to Parliament by the end of 2021. As a result, the bill might not come into effect until late 2023, or even 2024, the report said. During a briefing ahead of the report's publication, Lord Puttnam described the delay as "inexcusable."

"The challenges are moving faster than the government and the gap is getting larger and larger," he said. "Far from catching up, we're actually slipping behind."

The report detailed the ways in which Ofcom, which would be the designated online harms regulator, should be able to hold the companies accountable under the legislation. The regulator should have the power to fine digital companies up to 4 percent of their global turnover, or to force ISPs to block serial offenders, it said. 

Online platforms are "not inherently ungovernable," it said, as it urged the government not to "flinch in the face of the inevitable and powerful lobbying of big tech."

The report looked specifically at the recent case in which Twitter chose to hide some of President Donald Trump's tweets that violated its policies, and criticized Facebook's decision not to follow suit. Lord Puttnam said that Twitter CEO Jack Dorsey had "badly wrongfooted Facebook."

That story is not over yet, he added, but he was optimistic that Twitter's decision to take action against the president when he violated the platform's rules might have a knock-on effect. 

"There's a sense that these large companies look at each other and when one makes a sensible shift in a sensible direction, the others feel very, constrained, very under pressure to make a similar shift," he said.

There have been many efforts across Europe and further afield to put pressure on big tech, not just to crack down on fake news, but also to pay more taxes and change their practices through antitrust decisions and privacy regulation. The success of these efforts so far is debatable, but Lord Puttnam and other committee members ultimately expressed their optimism that positive change would come to the tech industry.

The government now has two months to respond to the report. If it embraces the committee's recommendations, the committee believes, there is a chance that tech could support democracy and help restore public trust, instead of further undermining it. 

