Monday, 11 September 2023
Private Members' Business
Freedom of Speech
The World Economic Forum has ranked the spread of misinformation as among the world's top risks. There are numerous reasons we need to act on misinformation and disinformation. Both have real-world dangers. These include consequences for public health and safety, for social cohesion and for electoral integrity—and, therefore, for democracy itself. False, misleading and deceptive content has always been with us. The difference now is that, where a myth used to be spread amongst just a few people, social media has magnified this risk and can spread it a thousandfold or more. As Tim Berners-Lee, the inventor of the World Wide Web, said:
While the web has created opportunity, given marginalized groups a voice, and made our daily lives easier, it has also created opportunity for scammers, given a voice to those who spread hatred, and made all kinds of crime easier to commit.
We saw during the COVID pandemic the deadly consequences of misinformation and disinformation, as people turned to treatments such as drinking or injecting bleach or drinking alcohol-based cleaning products. In the US, for example, misinformation had severe public health consequences, often undermining efforts to contain the spread and impact of COVID-19. Misinformation about vaccines continues to pose significant health risks. When he was head of the NHS in the UK, Simon Stevens said misinformation from anti-vaxxers on social media had fuelled a tripling of measles cases in the country.
The cost of misinformation and disinformation in the electoral sphere is incalculable. A Brookings Institution paper found that misinformation is eroding the public's confidence in democracy. Meanwhile, the Australian Human Rights Commission has noted that misinformation is one of three particular risks to democracy and human rights in Australia.
Claims that the proposed changes amount to censorship misunderstand the operation of the bill, and the fact that digital platforms are responsible for the content on their platforms. Social media and digital platforms—private, overseas-based companies—already take down misinformation and disinformation content, at scale, every day. Some platforms have signed up to a voluntary self-regulatory code of practice that was developed by industry to respond to the threat of misinformation and disinformation. But voluntary codes have shortfalls. Not all platforms participate, and some cherry-pick the areas of the code that they will respond to. That's why a mandatory, co-regulatory approach is important.
The definition of misinformation covers content which is false, misleading or deceptive, which is likely to cause, or contribute to, serious harm, and which is provided on a digital service and spread at scale. It includes content disseminated with intent to deceive, including purposefully or maliciously disseminated information. The definition sets a high bar because of its serious harm threshold. The government is not ruling on what information is false. The Australian Communications and Media Authority would not have powers to determine what content is true or false or to direct that specific posts be removed. Digital platforms will continue to be responsible for the content on their services. The content of private messages, authorised electoral communications, parody and satire, and news media remains outside the scope of the proposed changes.
The bill is about transparency, and, importantly, it is about systems and processes. The ACMA would be able to check how a platform deals with online misinformation and disinformation that can cause serious harm and would be able to request changes to processes. The bill empowers the regulator to require greater transparency from big tech, to encourage compliance with industry codes and to require systemic improvements by industry where necessary, such as in relation to complaints-handling processes. As the former chair of the ACCC, Rod Sims, said, governments face two choices on these vital but difficult issues: do nothing and leave it to the platforms to decide whether to do anything at all, or seek to intervene in some way.
While the shadow minister has nothing more than a three-word slogan—'In the bin'—the Albanese government will not resile from holding big tech to account and keeping Australia safe online.