House debates

Tuesday, 16 March 2021

Bills

Online Safety Bill 2021, Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021; Second Reading

12:04 pm

Tim Watts (Gellibrand, Australian Labor Party, Shadow Assistant Minister for Communications and Cyber Security):

I rise to speak on the Online Safety Bill 2021 and the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021. Together, they seek to create a new online safety framework for Australians, 'a modern, fit-for-purpose regulatory framework that builds on the strengths of the existing legislative scheme for online safety'. Labor have a strong record, in government and in opposition, of supporting online safety measures for Australians, and we support the intent and the approach of these bills. From opposition, we have sought to take a constructive approach, and we support the measures in these bills to consolidate, update and enhance Australia's online safety laws.

For many years, Australians have been protected by laws to support online safety, and it's important that these laws are kept up to date, particularly in a sector as fast-moving as the technology and internet sector. To a large degree, these bills represent regulatory housekeeping by consolidating various online safety laws that have evolved over time into one bill, with some updates in response to Lynelle Briggs's independent review of online safety laws, which reported to the government in October 2018. As the report of the Briggs review notes:

Australia was one of the first countries to appreciate the threat to child safety afforded by the internet through excellent early work by the then Australian Broadcasting Authority (ABA), which led to amendments to the Broadcasting Services Act in 1999 that established the legislative framework for online content co-regulation in Australia.

The upshot was to extend the co-regulatory system for broadcasting to online content, with Australian content assessed and taken down if found to be non-compliant with national classification requirements.

…   …   …

The sorts of prohibited content the co-regulatory system seeks to constrain are illegal material such as child sexual abuse material, extremely violent and disturbing pornography, extremist propaganda, incitement to terrorism, and games that victimise and abuse children or encourage illegal activity. It also seeks to restrict access to content that may be suitable for adults, but not children…

This is the overarching framework that has been in place for some decades now.

Over 20 years later, the laws still sit under the Broadcasting Services Act, which is an act for the regulation of broadcasting services—television and radio services—rather than online safety. However, as the Briggs review also notes:

The Enhancing Online Safety for Children Act was introduced in 2015 and renamed the Enhancing Online Safety Act in 2017 when its coverage was extended to certain adults … experiencing image-based abuse.

That act sets the arrangements for the Office of the eSafety Commissioner and allows the eSafety Commissioner to administer the cyberbullying complaints scheme, which investigates serious cyberbullying of children, and the image-based abuse scheme, which provides a reporting and investigation mechanism for the non-consensual sharing of intimate images.

Labor supports the aim of consolidating online safety laws into a new framework. This bill retains and replicates the existing provisions of the Enhancing Online Safety Act 2015 that are working well to protect Australians from online harms, such as the non-consensual sharing of intimate images scheme. These bills reflect a modernised online content scheme to replace the schemes that are currently in schedules 5 and 7 of the Broadcasting Services Act to address harmful content like refused-classification material.

These bills update various other elements, including setting new regulatory benchmarks—for example, by broadening the cyberbullying scheme to capture harms occurring on services other than social media; reducing the time frame for service providers to respond to a removal notice from the eSafety Commissioner, from 48 hours to 24 hours; bringing providers of app distribution services—app stores and internet search engine services—clearly into the remit of the new online content scheme; and establishing a specific and targeted power for the eSafety Commissioner to request or require ISPs to disable access to material depicting, promoting, inciting or instructing abhorrent violent conduct for time-limited periods in crisis situations, reflecting the industry's call for government leadership on this issue.

Labor also supports the new elements of these bills that go beyond regulatory housekeeping, including the creation of a complaints-based removal notice scheme for cyberabuse being perpetrated against an Australian adult; and the articulation of a core set of basic online safety expectations to improve and promote online safety for Australians.

However, we are concerned about a number of aspects of these bills, and we have sought to be constructive in the way that we have expressed these concerns. Firstly, there is the government's delay and mismanagement of the process of getting a bill for a new online safety act before the parliament here today, which has substantive consequences. Secondly, there is the government's inability, after all of this time, to address key stakeholder concerns about serious, important and legitimate issues enlivened by these bills.

The safety of Australians online is of real importance, and Labor will work with the government to iron out these concerns in these bills in time for the debate on this bill in the Senate. But, in the meantime, Labor will not oppose these bills in the House of Representatives, and we will support passage through this place on the understanding that government amendments will be forthcoming. We have been in good-faith conversations with the government, and we expect those good-faith conversations to result in further changes.

I just want to articulate a little bit about the delay and mismanagement in the process that has brought the bill to the House today. The idea of an online safety bill has been around for 2½ years, since the Briggs review in October 2018 recommended a single up-to-date online safety act. Now the government has gotten around to introducing a bill to address this recommendation. But for almost two years the government has been spruiking this new online safety act in press conferences and media announcements as if it already existed. It's been a bit of a catch-all solution. In the lead-up to the May 2019 federal election, the Morrison government promised to introduce a new online safety act. In September 2019, the minister for communications spruiked the new online safety act in answer to questions about what the government was doing to keep Australians safe online, including in relation to the rise of right-wing extremism, online hate speech and racism in Australia following the Christchurch terrorist atrocity. A year later, in September 2020, the minister again spruiked the still non-existent online safety act in response to questions about what the government was doing to curb graphic content on social media platforms in the wake of a self-harm video on Facebook and TikTok. The minister's October 2020 op-ed on this topic kept the promise of a new online safety act alive, while Labor had been asking where it was, with the department at Senate estimates last year putting the delay down to 'pressures on drafting resources'. That's legislative drafting, not media release drafting. There seems to be an unlimited supply of that in the Morrison government.

After all this time, the government still doesn't quite have it ready to go. During the Senate inquiry into the provisions of this bill, when asked about key operational aspects of this bill, such as the novel adult cyberabuse scheme, even the eSafety Commissioner referred to it as a 'sausage that is still being made'. After all this time, a number of stakeholders are concerned that the government introduced the bill into parliament on 24 February 2021, only eight working days after consultation on the exposure draft of the legislation concluded on 14 February 2021. The short time frame at the end of this drawn-out process—delay, delay, delay and rush, rush, rush—has undermined confidence amongst stakeholders in the government's exposure draft consultation process, with a number of people being concerned that submissions have not been considered properly. In evidence to the Senate inquiry into the bills, the department confirmed that 376 submissions on the exposure draft were received by the government and that the government had identified 56 issues that warranted further consideration by the minister as a result but that only seven amendments, of a technical nature, were made to the bill as a result of that consideration.

Given the significant passage of time since the Briggs review reported, it's disappointing that the government has failed to provide a process that satisfies stakeholders that their concerns are being taken on board. There are a wide range of stakeholders who still have valid concerns with this bill. As the report of the Senate inquiry into the bills notes:

Overwhelmingly, submitters and witnesses expressed support for the objectives and many provisions of the OS Bill … Most submitters appreciated the public consultations conducted by the department but raised concerns about matters that they argued have not been addressed … and which some considered long overdue for reform.

Some of the concerns noted in the inquiry report remain outstanding.

I should note that there are a number of other policy review processes in train from the government that have not been concluded but that have a very significant and consequential impact on the operation of this bill. Just to take a few currently underway: the review of Australian classification regulation, the results of which will have very significant implications for the online content scheme; the review of the Privacy Act 1988, which may well consider the implications of some of the deanonymisation provisions of this bill and their interaction with the GDPR obligations imposed on Australian companies; the government response to the report of the Australian Taskforce to Combat Terrorist and Extreme Violent Material Online; and the government responses to related parliamentary inquiries into online safety. We don't know what those reports say or what the government's response to them will be, but they will have significant consequences for the operation of this bill. This process has not been managed well in a policy development sense.

I'll give you a flavour of the concerns that stakeholders still have with these bills. Google, in its submission, in response to the fact that the schemes apply to a wide range of services, including messaging services and email, suggested:

… the scope of the Bill be limited in scope to content sharing services, like social media and video sharing services, which have the principal purpose of helping people to store and share content with the public or other broad audiences …

Sex Work Law Reform Victoria stated in its submission:

… Part 9 of The Bill provides the eSafety Commissioner with enhanced powers to issue removal notices to take down BDSM and fetish porn within 24 hours; similar powers had not been used to target such content in the past and were not intended to be used to target such content in the future.

It remains unclear why Part 9 has been drafted in a manner to give the eSafety Commissioner powers they don't intend to use.

Twitch stated in its submission:

… any scheme that justifies mandating the complete removal of a service on the basis of its non-compliance with notices should also take considerable steps to establish confidence that the service is demonstrating actual non-compliance …

This concern has been echoed by other large platforms, which fear they could also be shut down overnight, despite their best efforts to comply with the law. Communications Alliance stated in its submission:

… messaging services (e.g. WhatsApp, Signal, Telegram) are often end-to-end encrypted and may not offer an option for removal of individual parts of a conversation. Does this mean that user accounts would be required to be suspended, restricted or terminated when a complaint (that has been found valid) about cyber-abuse material has been received? It is not clear that wholesale suspension from a messaging service is a proportionate response to a report of bullying and harassment – especially given how nuanced and complex private conversations between adults can be.

…   …   …

Importantly, how would the eSafety Commissioner determine, in the context of a private communication between two individuals, whether a certain behaviour constitutes cyber abuse, without extensive knowledge of the context and background of that communication?

…   …   …

It is worth noting that the German Netzwerkdurchsetzungsgesetz (NetzDG) [Network Enforcement Act] has refrained from including private messaging services in its scope.

Finally, Electronic Frontiers Australia stated in its submission:

EFA recommends that the eSafety Commissioner should regularly report on the outcomes achieved as a result of its actions, rather than merely the activities it has performed. Activity is not the same as effectiveness. These results should be specific, tangible, and beneficial. Any adverse effects experienced as a result of the eSafety Commissioner's actions should also be contained in such reports, so that Australians can accurately determine if the costs of the eSafety Commissioner's actions outweigh the benefits.

Other significant issues of concern include how the bill interacts with the existing framework of safeguards put in place by the telecommunications assistance and access regime, as well as other matters that interact with freedom of speech.

Finding a balance between freedom of speech and protections against certain kinds of harmful speech is a complex endeavour, and we are concerned that this bill represents a significant increase in the eSafety Commissioner's discretion to remove material without commensurate checks and balances. Over the years, dominant digital platforms have developed robust policies to remove inappropriate content, including abusive content, while ensuring that legitimate political speech remains. The question, in relation to the proposed adult cyberbullying regime, is: where will the eSafety Commissioner draw the line?

During the Senate inquiry hearing, questions were put to the government about where and how the line will be drawn, but there is a real lack of conceptual and operational clarity around these elements. The sausage is still being made, it seems. It doesn't take much imagination to foresee a situation where, in the hands of an overzealous eSafety Commissioner, legitimate speech could be silenced, whether that of racial or religious minorities expressing outrage at racist speech or of women expressing outrage at sexual violence in the workplace. The line between adult cyberbullying, defamation and legitimate speech, for example, may not always be clear and may at times be the subject of intense political pressure. It's possible that notices could be used to silence important speech, be they opinions or allegations—for example, if they are considered to be offensive and harassing. Without strong procedural fairness and oversight in law, there's not much to prevent that from happening. The avenues for appeal may be difficult to access.

Whilst supportive of a scheme for adult cyberabuse, Labor finds it curious that the government has made repeated attempts to repeal section 18C of the Racial Discrimination Act on the grounds that it unduly restricts freedom of speech, despite the availability of defences in section 18D, yet is now seeking to rush through a bill that empowers the eSafety Commissioner with discretion to determine matters of speech in relation to adult cyberbullying without greater checks, balances or operational clarity.

It is worth recalling what some members of this government have had to say about section 18C and freedom of expression in the past. To quote Senator Hume's speech in the second reading debate:

… what has become apparent is that the Racial Discrimination Act is a law that gave practical effect to making it unlawful to hurt people's feelings … this protection—

of the International Convention on the Elimination of All Forms of Racial Discrimination

needs to be consistent with the right of freedom of speech, which is the cornerstone of a strong and healthy liberal democracy.

…   …   …

Section 18C has been the only law in Australia and perhaps the only law in the world by which liability is determined exclusively from the standpoint of would-be victims.

…   …   …

Clearly, section 18C is stymieing legitimate debate that is potentially useful and healthy.

To quote Senator James McGrath's speech:

How we see 18C operate is a classic demonstration of the consequences of the government back in 1995 bringing in changes to the Racial Discrimination Act that have had the effect of shutting down freedom of speech.

…   …   …

It is offensive on so many levels that the left are using freedom of speech to effectively criminalise those people with whom they disagree. This is the challenge that is facing Australia at the moment.

Or to quote James Paterson's speech: 'The way to beat racism is through debate, not the closing down of debate.'

I will be interested to see those senators' speeches on this bill. Indeed, I'm interested in seeing the speeches of the members who have brought supposedly pro-free-speech private members' bills into this chamber. This government talks a big game about its 'expectations' of social media platforms, yet it has failed to do its job by updating Australia's online safety laws. While the government is right to expect digital platforms to offer more in terms of transparency, so too must government be prepared to provide transparency around its decision-making, particularly on important matters that engage with human rights. Labor believes that the government must consider further amendments to clarify the scope of this bill and to strengthen due process, appeals, oversight and transparency requirements, given the important free-speech and digital-rights considerations that it engages. Online safety is of increasing importance to Australians as we spend more time online. That is why Labor will work constructively with the government to iron out concerns with this bill and to improve these laws in the interests of safeguarding the online safety of Australians.

This bill is something of a missed opportunity, and there's more work to be done when it comes to keeping Australians safe online. While the bill gives a lot of power to the eSafety Commissioner, it doesn't empower the commissioner to respond to hate speech that targets groups. This is a bill that deals with speech that targets individuals. The eSafety Commissioner's own research found that one in seven Australian adults had been the victim of hate speech in the 12 months to August 2019. Despite these shockingly high rates of hate speech, under this bill hate speech can only be addressed by the eSafety Commissioner if the speech is also considered a form of bullying that meets the threshold of the adult cyberbullying scheme, for which the bar is intended to be high. We need to make the online environment safe for everyone. The negative outcomes we're trying to avoid by combatting individual bullying can occur just as easily when an individual is targeted as part of a minority group. Teenagers being targeted because of their sexuality are harmed by broad statements that vilify them as part of the broader group, as well as by bullies who target them as individuals.

The failure to properly address the same harm in both forms is a serious flaw in this bill. This is particularly an issue when it comes to material that can radicalise people to commit violent acts. Yesterday we marked two years since the Christchurch massacre, two years since one of our own murdered 51 Kiwis at two mosques. We failed to have a serious conversation about the extent to which the Christchurch terrorist was radicalised on our shores by talking to Australians online. One of the things we did in the wake of the horrific event was to sign up to the Christchurch Call, led by the New Zealand government, which encourages governments and platforms to work together to combat violent extremist content. There is a broader conversation that we need to have in our society to address the online speech that leads to violent extremism. We should take the same approach as the Christchurch Call in this bill with respect to hate speech. As the Online Hate Prevention Institute stated in their submission:

Bullying a child or an adult as an individual are covered, however, attacks on a group that includes that child or adult are not. For the most vulnerable, including those at elevated risk of suicide or self-harm due to online harassment, being targeted as part of a group is just as harmful and such content is as much a threat to online safety. Some of this content is the kind of hate speech which feeds into radicalization.

And as Reset Australia, in their submission, acknowledge:

… that dangerous, violent and hateful material must be taken down, and that an independent regulator must be empowered to do so …

Labor has a strong track record when it comes to promoting online safety. Labor is deeply committed to keeping Australians safe online. In 2008, the Labor government delivered $125.8 million towards a cybersafety plan to combat online risks to children and to help parents and educators protect children from inappropriate material and contacts when online. In 2010, the Labor government established the Joint Select Committee on Cyber-Safety as part of its commitment to investigate and improve cybersafety measures, releasing a report with 32 recommendations, each of which was endorsed and responded to by the Labor government. We've taken this issue seriously both in government and in opposition. Since 2013, Labor has supported government e-safety and online wagering initiatives in parliament, and the government has acknowledged the strong bipartisan support of the opposition in this area.

Labor is proud to have led calls for the criminalisation of the non-consensual sharing of intimate images. In October 2015, I was proud to introduce a private member's bill, which was seconded by the member for Griffith, to criminalise the sharing of private sexual material without consent. Shortly after that, the Senate Legal and Constitutional Affairs References Committee established an inquiry into this very serious issue. Finally, in 2018, this became law as part of a government bill. Further, Labor senators supported the recommendations of the Senate Environment and Communications References Committee inquiry into harm being done to Australian children through access to pornography on the internet, which reported in November 2016, and the inquiry into gaming microtransactions for chance-based items, or 'loot boxes', in November 2018. Similarly, Labor members supported the recommendations of the House of Representatives Standing Committee on Social Policy and Legal Affairs inquiry into age verification for online wagering and online pornography, which reported in February 2020, and I note the important additional comments of Labor members on that report. As Labor has done before, we note that the government has the benefit of a report of an expert working group, convened by the eSafety Commissioner and participated in by industry, which remains cabinet-in-confidence. Labor strongly encourages the government to reclassify the report and make it public so that the broad range of stakeholders supportive of online safety may have the benefit of the work in this area that has bipartisan support, so that we may all work together to keep Australians safe online.

Australia has long recognised the internet as a governed space. Indeed, Australian governments have sought to regulate it. Twenty-five years ago, the Australian Parliamentary Library published a research paper entitled Can the Internet be regulated? Among other things, that paper noted that legislation then being considered by the Australian states and territories provided an incentive for establishing a code of conduct, and the then Australian Broadcasting Authority announced an inquiry into the regulation of online content services, proposing the exploration of various strategies, including codes of practice, complaints procedures and education programs, in addition to devices for blocking or filtering certain materials, and offence provisions. Following that, early legislative reforms directed at regulation of the internet in Australia included the 1999 amendments to the Broadcasting Services Act, which established a regulatory regime for internet service providers and online content. Other early legislative measures included the Cybercrime Act 2001 and the Spam Act 2003. In addition, the Office of the eSafety Commissioner was created a few years ago. In recent years, the government has stated that it is important to recognise that the internet is not 'the Wild West, where the rule of law and standards of decency shouldn't apply'. That is true. However, the question has long been not whether to regulate the internet but how best to regulate it. That, in many respects, remains an unanswered question, thanks to the mismanaged process of bringing this bill before the House.

The year 2021 has borne witness to some very significant developments in the relationship between government and the internet. The deplatforming of US President Donald Trump was a particular flashpoint in this respect. Last year, many parents and carers received an urgent message from their children's school—I know I did—warning of a particular suicide video that was circulating on social media, which children were being exposed to. An ABC Four Corners investigation brought to light concerning reports of sexual assault in Australia facilitated by dating apps like Tinder. Further afield, public interest reporting around consent and content on the Canadian pornography website Pornhub forced that platform—seemingly overnight, with the flick of a switch—to clean up its act around what videos are uploaded and to require verification of uploaders. And, in 2019, Australia responded to the Christchurch terrorist atrocity with a law that put some of the practical challenges of internet governance back onto the platforms. For too long, governments and regulators around the world had seen all manner of crimes live streamed yet had failed to act appropriately. It is clear that we must be less reactive and more proactive when it comes to online safety. That should be reflected in a modernised online safety act, as well as in the content of that act. It is essential that elected representatives set norms through dialogue, debate and a bipartisan commitment to keeping Australians safe online, as well as through well-crafted and carefully balanced laws.

Labor supports steps to improve the online safety of Australians and to modernise regulatory frameworks. This is not a set-and-forget task. There is much more to do. Importantly, it is essential that any law to improve the online safety of Australians is well crafted and well balanced where it engages with human rights issues. We acknowledge the efforts of government, industry and civil society stakeholders who, in furtherance of this goal of enhancing the online safety of Australians, have made submissions to the process that the government has managed in bringing this bill before the House. Labor supports the objectives of this bill. We also acknowledge key and important concerns that remain outstanding. On that note, I move:

That all words after "That" be omitted with a view to substituting the following words:

"whilst not declining to give the bill a second reading, the House notes:

(1) it has been almost two and a half years since the 2018 independent review by Lynelle Briggs AO recommended a new Online Safety Act;

(2) given the passage of time, it is disappointing that the Government has proved incapable of conducting a process that satisfies stakeholders on the range of serious and important issues the bill engages with;

(3) submissions on the exposure draft of the legislation as well as to the Senate committee inquiry into the bill identified a range of concerns which remain outstanding;

(4) the Department of Infrastructure, Transport, Regional Development and Communications advised the Senate inquiry that it identified 56 issues that warranted further consideration by the Minister following consultation on the exposure draft, but only seven amendments of a technical nature were made to the bill as a result of that consideration; and

(5) further Government amendments to the bill are anticipated to address concerns with the bill and should be shared for consideration ahead of debate in the Senate."
