The Covid-19 pandemic has dominated the global news agenda for weeks now. ‘Traditional’ news sources in the UK have seen a seismic shift in audience numbers. Social media sites, like Facebook and Twitter, took decisive action to effectively shut down the spread of misinformation in the interests of public safety. In doing so, they have proved that it can be done. Which raises the far more concerning question: if they can do it, why didn’t they do it before?

‘Fake news’ has dominated the agenda for months, if not years. Social media has been the main driver for this: the algorithms are built to promote content that generates the most engagement. In other words, the posts that get the most clicks and shares are the ones that spread the furthest (and fastest). The content itself is irrelevant – no one is checking the facts. If it’s popular, it gets promoted.

This didn’t seem to bother anyone too much when we were just dealing with elections. Perhaps this is because so many people think that politics is dull. Nigel Farage in a pub, with a pint, giving his opinion on immigration is – sadly – more interesting than Philip Hammond talking about the triple-lock pension.

However, when the world faces a global pandemic it’s a different story. People are, in fact, interested in the facts. And when they want facts, they turn to trusted sources. More than 9 million people are tuning into the BBC Six O’Clock News – viewing figures that haven’t been seen since Christmas Day 2008. In more usual times, the figure sits somewhere around 4 million. And it’s not just the BBC seeing a surge: ITV has confirmed a 32% rise in daytime viewing figures, and Channel 4 revealed that its audience has doubled since the pandemic began.

Make no mistake, though: Facebook (which also owns Instagram) and Twitter have both played a central role in stopping the spread of false information about Covid-19. Right at the very start of the outbreak, the Government pulled in the top bosses from various social media and technology companies and asked them directly to ensure misinformation was stopped as far as possible. They agreed. Facebook in particular said it would focus on “posts that are designed to discourage treatment”, including posts about false cures.

It’s widely accepted that Facebook and Twitter have been broadly successful. The majority of misinformation is now being shared on WhatsApp (also owned by Facebook), which is almost impossible to police given the app’s encryption and private nature – a problem in its own right. But the fact that Facebook can, and has, effectively shut down misinformation about the coronavirus demonstrates that it can be done.

Elections, of course, are a concern. But this also applies to other areas such as cyberbullying and stalking, self-harm and radicalisation, to name just a few. Once this is over, and we get back to ‘normal’ (whatever that looks like), social media platforms are going to face some very tough questions. And, quite frankly, it’s about time.