House debates

Monday, 11 September 2023

Private Members' Business

Freedom of Speech

6:06 pm

Zoe Daniel (Goldstein, Independent)

Fake news is, of course, not new. People have been manipulated by false information throughout history. But the digital world has promoted a post-truth era where facts are now a matter of opinion. As a community, we urgently need to decide whether we want to have a measure of control over the information that we consume while protecting freedom of expression and avoiding government overreach. The government's proposed laws on misinformation and disinformation are therefore well timed. Their form, however, must be carefully calibrated.

It's important to distinguish between misinformation and disinformation. The former refers to false or inaccurate information, getting the facts wrong; the latter to false information which is deliberately intended to mislead. In the digital world, both spread fast, often in an uncontrolled way, and cause harm. In some cases, this can be outright dangerous, as was the case during COVID, when false treatments were promoted online. During Hurricane Ian in the US in 2022, Russian news outlet Sputnik, in what was interpreted as an attempt to undermine public trust in the authorities, promoted a narrative that the US government had abandoned storm victims.

I spent several years reporting on Donald Trump's presidency in the United States. My observation is that Trump's deliberate seeding of disinformation throughout 2020, claiming that the election was going to be rigged, triggered the eventual storming of the US Capitol by his supporters in January 2021. The damage to US democracy lingers, with millions of Americans continuing to believe the election was stolen, and that belief may yet deliver Trump a return to the White House in 2024.

Social cohesion, public health and safety, and political stability are all at risk from the rapid and uncontrolled spread of mis- and disinformation. Right now, according to a survey by Reset.Tech Australia, social media platforms are not living up to their own guidelines on fake news or to the Australian Code of Practice on Disinformation and Misinformation, with claims that the Voice referendum would be invalid or illegal still available for all to see. As La Trobe University states in its 2021 Fighting Fake News report, traditional media can also contribute to the problem through the amplification of fake news. But, as Associate Professor Andrea Carson asks: what's the best way to manage it? How can it be done without government overreach, which risks the freedom and diversity of expression necessary in healthy democracies?

By attaching penalties to what is currently voluntary participation, the government's bill would bring us closer to the EU-style model of mandatory co-regulation. This sits roughly midway on the spectrum of responses to the problem: at the lower end, non-regulatory approaches such as digital literacy and fact-checking; at the upper end, sometimes draconian anti-fake-news laws that governments can misuse. Russia, for example, legislated to suppress media and political dissent about its war in Ukraine.

Co-regulation, like that proposed in this bill, is not the same thing. Indeed, the core thesis of this proposal was Coalition policy and previously had bipartisan support. Ironically, since then the bill itself has arguably been the subject of disinformation, in the form of a knee-jerk reaction about freedom of expression. What those expressing concern need to answer is this: do they believe mis- and disinformation are a threat to democracy; and, if not this approach, then what? This is absolutely a reasoned debate that is worth having.

As a former journalist, I believe wholeheartedly that freedom of expression is central to a healthy democracy. At the centre of the bill under discussion today is the fact that it is not the government but the platforms that would remain responsible for their content. New powers would allow ACMA access to a platform's procedures to see how it deals with online mis- and disinformation that can cause serious harm, and to request changes to processes, not content. ACMA would not be given arbitrary powers to determine what content is true or false, nor to have posts removed.

Australia has had a voluntary code along these lines since 2021, but not everyone opts in. Under the bill, participation in registered codes would be compulsory, with warnings, fines and penalties for noncompliance. However, the definitions of mis- and disinformation in this bill may be too broad, and the exclusion of government information and online news media content is also questionable. This is an uncomfortable but necessary conversation, and any legislation must be carefully calibrated to help rebuild, not further erode, public trust. I say: don't bin the bill; fix the bill.
