House debates

Tuesday, 16 March 2021


Online Safety Bill 2021, Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021; Second Reading

12:04 pm

Tim Watts (Gellibrand, Australian Labor Party, Shadow Assistant Minister for Communications and Cyber Security):

I rise to speak on the Online Safety Bill 2021 and the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021. Together, they seek to create a new online safety framework for Australians, 'a modern, fit-for-purpose regulatory framework that builds on the strengths of the existing legislative scheme for online safety'. Labor have a strong record, in government and in opposition, of supporting online safety measures for Australians, and we support the intent and the approach of these bills. From opposition we've sought to take a constructive approach to these bills. We support the measures in these bills to consolidate, update and enhance online safety laws for Australians.

For many years, Australians have been protected by laws to support online safety, and it's important that these laws are kept up to date, particularly in a sector as fast-moving as the technology and internet sector. To a large degree, these bills represent regulatory housekeeping by consolidating various online safety laws that have evolved over time into one bill, with some updates in response to Lynelle Briggs's independent review of online safety laws, which reported to the government in October 2018. As the report of the Briggs review notes:

Australia was one of the first countries to appreciate the threat to child safety afforded by the internet through excellent early work by the then Australian Broadcasting Authority (ABA), which led to amendments to the Broadcasting Services Act in 1999 that established the legislative framework for online content co-regulation in Australia.

The upshot was to extend the co-regulatory system for broadcasting to online content, with Australian content assessed and taken down if found to be non-compliant with national classification requirements.

…   …   …

The sorts of prohibited content the co-regulatory system seeks to constrain are illegal material such as child sexual abuse material, extremely violent and disturbing pornography, extremist propaganda, incitement to terrorism, and games that victimise and abuse children or encourage illegal activity. It also seeks to restrict access to content that may be suitable for adults, but not children…

This is the overarching framework that has been in place for some decades now.

Over 20 years later, the laws still sit under the Broadcasting Services Act, which is an act for the regulation of broadcasting services—television and radio services—rather than online safety. However, as the Briggs review also notes:

The Enhancing Online Safety for Children Act was introduced in 2015 and renamed the Enhancing Online Safety Act in 2017 when its coverage was extended to certain adults … experiencing image-based abuse.

That act sets the arrangements for the eSafety office and allows the eSafety Commissioner to administer the cyberbullying complaints scheme, which investigates serious cyberbullying of children, and the image-based abuse scheme, which provides a reporting and investigation mechanism for the non-consensual sharing of intimate images.

Labor supports the aim of consolidating online safety laws into a new framework. This bill retains and replicates the existing provisions of the Enhancing Online Safety Act 2015 that are working well to protect Australians from online harms, such as the non-consensual sharing of intimate images scheme. These bills reflect a modernised online content scheme to replace the schemes that are currently in schedules 5 and 7 of the Broadcasting Services Act to address harmful content like refused-classification material.

These bills update various other elements, including setting new regulatory benchmarks—for example, by broadening the cyberbullying scheme to capture harms occurring on services other than social media; reducing the time frame for service providers to respond to a removal notice from the eSafety Commissioner from 48 hours to 24 hours; bringing providers of app distribution services (app stores) and internet search engine services clearly into the remit of the new online content scheme; and establishing a specific and targeted power for the eSafety Commissioner to request or require ISPs to disable access to material depicting, promoting, inciting or instructing abhorrent violent conduct for time-limited periods in crisis situations, reflecting the industry's call for government leadership on this issue.

Labor also supports the new elements of these bills that go beyond regulatory housekeeping, including the creation of a complaints-based removal notice scheme for cyberabuse being perpetrated against an Australian adult; and the articulation of a core set of basic online safety expectations to improve and promote online safety for Australians.

However, we are concerned about a number of aspects of these bills, and we have sought to be constructive in the way that we have expressed these concerns. Firstly, there is the government's delay and mismanagement of the process of getting a bill for a new online safety act before the parliament here today, which has substantive consequences. Secondly, there is the government's inability, after all of this time, to address key stakeholder concerns about serious, important and legitimate issues enlivened by these bills.

The safety of Australians online is of real importance, and Labor will work with the government to iron out these concerns in these bills in time for the debate on this bill in the Senate. But, in the meantime, Labor will not oppose these bills in the House of Representatives, and we will support passage through this place on the understanding that government amendments will be forthcoming. We have been in good-faith conversations with the government, and we expect those good-faith conversations to result in further changes.

I just want to articulate a little bit about the delay and mismanagement in the process that has brought the bill to the House today. The idea of an online safety bill has been around for 2½ years, since the Briggs review in October 2018 recommended a single up-to-date online safety act. Now the government has gotten around to introducing a bill to address this recommendation. But for almost two years the government has been spruiking this new online safety act in press conferences and media announcements as if it already existed. It's been a bit of a catch-all solution. In the lead-up to the May 2019 federal election, the Morrison government promised to introduce a new online safety act. In September 2019, the minister for communications spruiked the new online safety act in answer to questions about what the government was doing to keep Australians safe online, including in relation to the rise of right-wing extremism, online hate speech and racism in Australia following the Christchurch terrorist atrocity. A year later, in September 2020, the minister again spruiked the still non-existent online safety act in response to questions about what the government was doing to curb graphic content on social media platforms in the wake of a self-harm video on Facebook and TikTok. The minister's October 2020 op-ed on this topic kept the promise of a new online safety act alive, while Labor had been asking where it was, with the department at Senate estimates last year putting the delay down to 'pressures on drafting resources'. That's legislative drafting, not media release drafting. There seems to be an unlimited supply of that in the Morrison government.

After all this time, the government still doesn't quite have it ready to go. During the Senate inquiry into the provisions of this bill, when asked about key operational aspects of this bill, such as the novel adult cyberabuse scheme, even the eSafety Commissioner referred to it as a 'sausage that is still being made'. After all this time, a number of stakeholders are concerned that the government introduced the bill into parliament on 24 February 2021, only eight working days after consultation on the exposure draft of the legislation concluded on 14 February 2021. The short time frame at the end of this drawn-out process—delay, delay, delay and rush, rush, rush—has undermined confidence amongst stakeholders in the government's exposure draft consultation process, with a number of people being concerned that submissions have not been considered properly. In evidence to the Senate inquiry into the bills, the department confirmed that 376 submissions on the exposure draft were received by the government and that the government had identified 56 issues that warranted further consideration by the minister as a result but that only seven amendments, of a technical nature, were made to the bill as a result of that consideration.

Given the significant passage of time since the Briggs review reported, it's disappointing that the government has failed to provide a process that satisfies stakeholders that their concerns are being taken on board. There are a wide range of stakeholders who still have valid concerns with this bill. As the report of the Senate inquiry into the bills notes:

Overwhelmingly, submitters and witnesses expressed support for the objectives and many provisions of the OS Bill … Most submitters appreciated the public consultations conducted by the department but raised concerns about matters that they argued have not been addressed … and which some considered long overdue for reform.

Some of the concerns noted in the inquiry report are in relation to:

I should note that there are a number of other policy review processes in train from the government that have not been concluded but have a very significant and consequential impact on the operation of this bill. Just to take a few currently underway, we have the review of the Australian classification regulation, the results of which will have very significant implications for the online content scheme; a review of the Privacy Act 1988, which may well consider the implications of some of the deanonymisation provisions of this bill and its interaction with GDPR obligations that are imposed on Australian companies; the government response to the report of the Australian Taskforce to Combat Terrorist and Extreme Violent Material Online; and government responses to related parliamentary inquiries into online safety. We don't know either what those reports say or what the government's response to them will be, but they will have a significant consequence for the operation of this bill. This process has not been managed well in a policy development sense.

I'll give you a flavour of the concerns that stakeholders still have with these bills. Google, in its submission, responding to the fact that the schemes apply to a wide range of services, including messaging services and email, suggested:

… the Bill be limited in scope to content sharing services, like social media and video sharing services, which have the principal purpose of helping people to store and share content with the public or other broad audiences …

Sex Work Law Reform Victoria stated in its submission:

… Part 9 of The Bill provides the eSafety Commissioner with enhanced powers to issue removal notices to take down BDSM and fetish porn within 24 hours. Similar powers had not been used to target such content in the past and were not intended to be used to target such content in the future.

It remains unclear why Part 9 has been drafted in a manner that gives the eSafety Commissioner powers they don't intend to use.

Twitch stated in its submission:

… any scheme that justifies mandating the complete removal of a service on the basis of its non-compliance with notices should also take considerable steps to establish confidence that the service is demonstrating actual non-compliance …

This concern has been echoed by other large platforms, which fear they could also be shut down overnight despite their best efforts to comply with the law. Communications Alliance stated in its submission:

… messaging services (e.g. WhatsApp, Signal, Telegram) are often end-to-end encrypted and may not offer an option for removal of individual parts of a conversation. Does this mean that user accounts would be required to be suspended, restricted or terminated when a complaint (that has been found valid) about cyber-abuse material has been received? It is not clear that wholesale suspension from a messaging service is a proportionate response to a report of bullying and harassment – especially given how nuanced and complex private conversations between adults can be.

…   …   …

Importantly, how would the eSafety Commissioner determine, in the context of a private communication between two individuals, whether a certain behaviour constitutes cyber abuse, without extensive knowledge of the context and background of that communication?

…   …   …

It is worth noting that the German Netzwerkdurchsetzungsgesetz (NetzDG) [Network Enforcement Act] has refrained from including private messaging services in its scope.

Finally, Electronic Frontiers Australia stated in its submission:

EFA recommends that the eSafety Commissioner should regularly report on the outcomes achieved as a result of its actions, rather than merely the activities it has performed. Activity is not the same as effectiveness. These results should be specific, tangible, and beneficial. Any adverse effects experienced as a result of the eSafety Commissioner's actions should also be contained in such reports, so that Australians can accurately determine if the costs of the eSafety Commissioner's actions outweigh the benefits.

Other significant issues of concern include how the bill interacts with the existing framework of safeguards put in place by the telecommunications assistance and access regime, as well as other matters that touch on freedom of speech.

Finding a balance between freedom of speech and protections against certain kinds of harmful speech is a complex endeavour, and we are concerned that this bill represents a significant increase in the eSafety Commissioner's discretion to remove material without commensurate checks and balances. Over the years, dominant digital platforms have developed robust policies to remove inappropriate content, including abusive content, while ensuring legitimate political speech remains. The question, in relation to the proposed adult cyberbullying regime, is: where will the eSafety Commissioner draw the line?

During the Senate inquiry hearing, questions were put to the government about where and how the line will be drawn, but there is a real lack of conceptual and operational clarity around these elements. The sausage is still being made, it seems. It doesn't take much imagination to foresee a situation where, in the hands of an overzealous eSafety Commissioner, legitimate speech could be silenced, whether that of racial or religious minorities expressing outrage at racist speech or of women expressing outrage at sexual violence in the workplace. The line between adult cyberbullying, defamation and legitimate speech, for example, may not always be clear and may at times be the subject of intense political pressure. It's possible that notices could be used to silence important speech, be it opinions or allegations—for example, if they are considered to be offensive and harassing. Without strong procedural fairness and oversight in law, there's not much to prevent that from happening. The avenues for appeal may be difficult to access.

Whilst supportive of a scheme to address adult cyberabuse, Labor finds it curious that the government has made repeated attempts to repeal section 18C of the Racial Discrimination Act on the grounds that it unduly restricts freedom of speech, despite the availability of defences in section 18D. It is now seeking to rush through a bill that empowers the eSafety Commissioner with discretion to determine matters of speech in relation to adult cyberbullying without greater checks, balances or operational clarity.

It is worth recalling what some members of this government have had to say about section 18C and freedom of expression in the past. To quote Senator Hume's speech in the second reading debate:

… what has become apparent is that the Racial Discrimination Act is a law that gave practical effect to making it unlawful to hurt people's feelings … this protection—

of the International Convention on the Elimination of All Forms of Racial Discrimination—

needs to be consistent with the right of freedom of speech, which is the cornerstone of a strong and healthy liberal democracy.

…   …   …

Section 18C has been the only law in Australia and perhaps the only law in the world by which liability is determined exclusively from the standpoint of would-be victims.

…   …   …

Clearly, section 18C is stymieing legitimate debate that is potentially useful and healthy.

To quote Senator James McGrath's speech:

How we see 18C operate is a classic demonstration of the consequences of the government back in 1995 bringing in changes to the Racial Discrimination Act that have had the effect of shutting down freedom of speech.

…   …   …

It is offensive on so many levels that the left are using freedom of speech to effectively criminalise those people with whom they disagree. This is the challenge that is facing Australia at the moment.

Or to quote Senator James Paterson's speech: 'The way to beat racism is through debate, not the closing down of debate.'

I will be interested to see those senators' speeches on this bill. Indeed, I'm interested in seeing the speeches of the members who have brought supposedly pro-free-speech private members' bills into this chamber. This government talks a big game about its 'expectations' of social media platforms, yet it has failed to do its own job of updating Australia's online safety laws. While the government is right to expect digital platforms to offer more in terms of transparency, so too must government be prepared to provide transparency around its decision-making, particularly on important matters that engage with human rights. Labor believes that the government must consider further amendments to clarify the scope of this bill and to strengthen its due process, appeal, oversight and transparency requirements, given the important free-speech and digital-rights considerations that it engages. Online safety is of increasing importance to Australians as we spend more time online. That is why Labor will work constructively with the government to iron out concerns with this bill and to improve these laws in the interests of safeguarding the online safety of Australians.

This bill is something of a missed opportunity, and there's more work to be done when it comes to keeping Australians safe online. While the bill gives a lot of power to the eSafety Commissioner, it doesn't empower the commissioner to respond to hate speech that targets groups. This is a bill that deals with speech targeting individuals. The eSafety Commissioner's own research found that one in seven Australian adults had been the victim of hate speech in the 12 months to August 2019. Despite these shockingly high rates of hate speech, under this bill hate speech can only be addressed by the eSafety Commissioner if it is also considered a form of bullying that meets the threshold of the adult cyberbullying scheme, for which the bar is intended to be high. We need to make the online environment safe for everyone. The negative outcomes we're trying to avoid by combatting individual bullying can occur just as easily when an individual is targeted as part of a minority group. Teenagers being targeted because of their sexuality are harmed by broad statements that vilify them as part of the broader group, as well as by bullies who target them as individuals.

The failure to properly address the same harm in both forms is a serious flaw in this bill. This is particularly an issue when it comes to material that can radicalise people to commit violent acts. Yesterday we marked two years since the Christchurch massacre, two years since one of our own murdered 51 Kiwis at two mosques. We have failed to have a serious conversation about the extent to which the Christchurch terrorist was radicalised on our shores by talking to Australians online. One of the things we did in the wake of that horrific event was to sign up to the Christchurch Call, led by the New Zealand government, which encourages governments and platforms to work together to combat violent extremist content. There is a broader conversation that we need to have in our society to address the online speech that leads to violent extremism. We should take the same approach as the Christchurch Call in this bill with respect to hate speech. As the Online Hate Prevention Institute stated in their submission:

Bullying a child or an adult as an individual are covered, however, attacks on a group that includes that child or adult are not. For the most vulnerable, including those at elevated risk of suicide or self-harm due to online harassment, being targeted as part of a group is just as harmful and such content is as much a threat to online safety. Some of this content is the kind of hate speech which feeds into radicalization.

And as Reset Australia, in their submission, acknowledge:

… that dangerous, violent and hateful material must be taken down, and that an independent regulator must be empowered to do so …

Labor has a strong track record when it comes to promoting online safety. Labor is deeply committed to keeping Australians safe online. In 2008, the Labor government delivered $125.8 million towards a cybersafety plan to combat online risks to children and to help parents and educators protect children from inappropriate material and contacts when online. In 2010, the Labor government established the Joint Select Committee on Cyber-Safety as part of its commitment to investigate and improve cybersafety measures, releasing a report with 32 recommendations, each of which was endorsed and responded to by the Labor government. We've taken this issue seriously both in government and in opposition. Since 2013, Labor has supported government e-safety and online wagering initiatives in parliament, and the government has acknowledged the strong bipartisan support of the opposition in this area.

Labor is proud to have led calls for the criminalisation of the non-consensual sharing of intimate images. In October 2015, I was proud to introduce a private member's bill, which was seconded by the member for Griffith, to criminalise the sharing of private sexual material without consent. Shortly after that, the Senate Legal and Constitutional Affairs References Committee established an inquiry into this very serious issue. Finally, in 2018, this became law as part of a government bill. Further, Labor senators supported the recommendations of the Senate Environment and Communications References Committee inquiry into harm being done to Australian children through access to pornography on the internet, which reported in November 2016, and the inquiry into gaming microtransactions for chance-based items, or 'loot boxes', in November 2018. Similarly, Labor members supported the recommendations of the House of Representatives Standing Committee on Social Policy and Legal Affairs inquiry into age verification for online wagering and online pornography, which reported in February 2020, and I note the important additional comments of Labor members on that report. As Labor has done before, we note that the government has the benefit of a report of an expert working group, convened by the eSafety Commissioner and participated in by industry, which remains cabinet-in-confidence. Labor strongly encourages the government to reclassify the report and make it public so that the broad range of stakeholders supportive of online safety may have the benefit of the work in this area that has bipartisan support, so that we may all work together to keep Australians safe online.

Australia has long recognised the internet as a governed space. Indeed, Australian governments have sought to regulate it. Twenty-five years ago, the Australian Parliamentary Library published a research paper entitled Can the Internet be regulated? Among other things, that paper noted that legislation considered by the Australian states and territories provided an incentive for establishing a code of conduct, and the then Australian Broadcasting Authority announced an inquiry into the regulation of online content services, proposing the exploration of various strategies, including codes of practice, complaints procedures and education programs, in addition to devices for blocking or filtering certain materials, and offence provisions. Following that, early legislative reforms directed at regulation of the internet in Australia included the 1999 amendments to the Broadcasting Services Act, which established a regulatory regime for internet service providers and online content. Other early legislative measures included the Cybercrime Act 2001 and the Spam Act 2003. In addition, the Office of the eSafety Commissioner was created a few years ago. In recent years, the government has stated that it is important to recognise that the internet is not 'the Wild West, where the rule of law and standards of decency shouldn't apply'. That is true. However, the question has long been not whether to regulate the internet but how best to regulate it. That, in many respects, remains an unanswered question, thanks to the government's mismanagement of the process of bringing this bill before the House.

The year 2021 has already borne witness to some very significant developments with respect to the relationship between government and the internet. The deplatforming of US President Donald Trump was a particular flashpoint in this respect. Last year, many parents and carers received an urgent message from their children's school—I know I did—warning of a particular suicide video that was circulating on social media and that children were being exposed to. An ABC Four Corners investigation brought to light concerning reports of sexual assault in Australia facilitated by dating apps like Tinder. Further afield, public interest reporting around consent and content on the Canadian pornography website Pornhub forced that platform—seemingly overnight, with the flick of a switch—to clean up its act around what videos are uploaded and to require verification of those uploading content. And, in 2019, Australia responded to the Christchurch terrorist atrocity with a law that put some of the practical challenges of internet governance back onto the platforms. For too long, governments and regulators around the world had seen all manner of crimes live streamed yet had failed to act appropriately. It is clear that we must be less reactive and more proactive when it comes to online safety. That should be reflected in the process of modernising the online safety act as well as in the content of that act. It is essential that elected representatives set norms through dialogue, debate and a bipartisan commitment to keeping Australians safe online, as well as through well-crafted and carefully balanced laws.

Labor supports steps to improve the online safety of Australians and to modernise regulatory frameworks. This is not a set-and-forget task. There is much more to do. Importantly, it is essential that law to improve the online safety of Australians is well crafted and well balanced where it engages with human rights issues. We acknowledge the efforts of government, industry and civil society stakeholders who, in furtherance of this goal of enhancing the online safety of Australians, have made submissions to the process that the government has managed in bringing this bill before the House. Labor supports the objectives of this bill. We also acknowledge key and important concerns that remain outstanding. On that note, I move:

That all words after "That" be omitted with a view to substituting the following words:

"whilst not declining to give the bill a second reading, the House notes:

(1) it has been almost two and a half years since the 2018 independent review by Lynelle Briggs AO recommended a new Online Safety Act;

(2) given the passage of time, it is disappointing that the Government has proved incapable of conducting a process that satisfies stakeholders on the range of serious and important issues the bill engages with;

(3) submissions on the exposure draft of the legislation as well as to the Senate committee inquiry into the bill identified a range of concerns which remain outstanding;

(4) the Department of Infrastructure, Transport, Regional Development and Communications advised the Senate inquiry that it identified 56 issues that warranted further consideration by the Minister following consultation on the exposure draft, but only seven amendments of a technical nature were made to the bill as a result of that consideration; and

(5) further Government amendments to the bill are anticipated to address concerns with the bill and should be shared for consideration ahead of debate in the Senate."

Rob Mitchell (McEwen, Australian Labor Party):

Is the amendment seconded?

Michelle Rowland (Greenway, Australian Labor Party, Shadow Minister for Communications):

I second the amendment and reserve my right to speak.

Rob Mitchell (McEwen, Australian Labor Party):

The original question was that this bill be now read a second time. To this, the honourable member for Gellibrand has moved as an amendment that all words after 'That' be omitted with a view to substituting other words. If it suits the House, I will state the question in the form that the words proposed to be omitted stand part of the question.

                12:32 pm

                Photo of Katie AllenKatie Allen (Higgins, Liberal Party) Share this | | Hansard source

                I rise today to support the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021 not only as the member for Higgins but as a concerned mother and former paediatrician. The last decade has seen the internet become an increasingly crucial centrepiece in all of our lives. It's brought us together for work, to learn and to connect more than ever. In this light, it is now critical, more than ever, that we work together to ensure the internet is as safe as possible so that everyone can enjoy the benefits of being connected without fear of the sometimes real risks of online connectivity.

                The internet generally and social media specifically are forms of communication that have been in transition. In some ways the net can be seen as the Wild West, with lots of opportunities—but with opportunities come unregulated challenges. The job of government is to get the balance right in legislation, and that includes putting online safety at the heart of decision-making but enabling opportunities so the world can flourish from the benefits of these online opportunities. This bill seeks to combat the evolving use of the internet in facilitating abusive social interactions, including the distribution of intimate images and cyberabuse, by amending offences in the Criminal Code which are specifically intended to prevent, deter and sanction—as well as educate and draw attention to the criminality of—this conduct.

                Many of us here in this chamber have heard stories about this, either from the news or our constituents, or may even have been affected ourselves. Some of us have fallen victim to some of these horrendous acts. As a parent, I look to the next generation, who are so incredibly reliant on digital connectivity for all things, from social life to study to work. But now it's reached right through the community—everyone from every generation—with more than 90 per cent of Australians connecting every day. When abuse through these services happens, it can seem inescapable for many. Unfortunately, some victims of these heinous crimes take their own lives. This is not acceptable. We must do better. And that's what we're doing with this bill. It's not just for the generations here today but for future generations, those that come after us. The Morrison government is determined that the standards and the rule of law that we enjoy in our everyday lives, as we walk around our communities, should also apply online. This is not just reasonable and sensible; it's practical. We need to pass this government's Online Safety Bill to ensure this can become a reality.

                The bill will strengthen the ability of the eSafety Commissioner to keep Australians safe when things go wrong online. I would like to take a moment to talk about the eSafety Commissioner, Julie Inman Grant. As the eSafety Commissioner, she is head of the world's first eSafety regulatory agency, and she is committed to keeping our citizens safe online. This is very important. This is a world first. No other country, to my knowledge, has legislated an eSafety commissioner. Ms Inman Grant spent two decades working in senior public policy and safety roles in the tech industry, at Microsoft, Twitter and Adobe. She's worked in the US Congress and was Global Director for Safety, Privacy Policy and Outreach at Microsoft for many years. She's also global chair of the Child Dignity Alliance's Technical Working Group and a board member of the WePROTECT Global Alliance. This is a woman who really knows her stuff. This bill will provide new powers to the eSafety Commissioner to address emerging harms and hold industry to account for the safety of their products and services.

                The Morrison government is also acting on its commitment to increase penalties for the use of a carriage service to menace, harass or cause offence, from three years up to five years. This will surely help deter many of these acts to begin with. The new Online Safety Bill will establish in law a set of basic online safety expectations for industry. This will help ensure those expectations are transparent and easy to comprehend. In light of the excellent skills and experience of the eSafety Commissioner, these are very relevant changes. Along with these basic expectations, it will also require mandatory transparency reporting. This will empower the eSafety Commissioner to require online services to provide specific information about online harms, such as their response to material depicting abhorrent violent conduct or even volumetric attacks, in which organised digital groups seek to overwhelm a victim with abuse. Further, it will see a strengthened cyberbullying scheme for Australian children, along with a new cyberabuse scheme, which will see the removal of the serious forms of online abuse from the internet, backed up by civil penalties. There will also be new requirements for image-based abuse content to be removed within 24 hours, and penalties for non-compliance.

                Yesterday marked two years since a shocking event occurred. Australians will not forget—simply cannot forget—what happened in the space of 36 minutes on 15 March 2019, when Brenton Tarrant, a far-right extremist, fatally shot 51 innocent people across two mosques in Christchurch. This was the deadliest terrorist attack in New Zealand's history. What was unique about this horrific terrorist attack was that it was live streamed on Facebook, highlighting the Achilles heel of such platforms when faced with the viral dissemination of extremely violent content. One can't unsee those images. The legislation before us will empower the eSafety Commissioner with a new rapid website-blocking power. This power can be used to block material depicting abhorrent violent conduct, in real time, during an online crisis.

                Enhanced information-gathering powers for the eSafety Commissioner will also help unmask the identities behind anonymous online accounts being used to abuse, bully or humiliate others. This is a part of the powers that I'm sure many Australians and, I suspect, many in this House—particularly public figures—will be celebrating. This kind of online trolling from fake or anonymous accounts can cause untold damage. It really does lower the tone of conversations that can be very empowering in an online space. For far too long and too often, many victims of online abuse have had abuse hurled at them from behind troll accounts. This act is a warning to all trolls: the tables have been turned; the eSafety Commissioner is coming for you.

                TV and radio presenter Erin Molan was forced to take police action after an online troll bombarded her with a series of vile messages threatening the life of her unborn daughter. I would like to acknowledge Erin and to thank her for sharing her story so publicly and bringing this story to light so that we can understand how terrible some of this trolling can be. Erin Molan is a leading voice for this legislative change and recently said:

                This historical, legislative and social milestone is setting a standard that the rest of the world should—and I think will—follow.

                It is true that our Australian regulations and legislation are being looked to by the rest of the world. We know this because Mathias Cormann was recently elected Secretary-General of the OECD on a platform which included digital taxation, reflecting our very good record of work in this important area.

                The Online Safety Bill provides a flexible, comprehensive framework that requires platforms to prioritise the safety of Australians and empowers the eSafety Commissioner to assist them when things go wrong online. This bill is one vital step to delivering overall safety online. But it is also true that we, as individuals, need to actively take steps to make the internet a safe and enjoyable space for every user. Together, we can and should work towards a better internet. The internet has democratised knowledge; it has made the world a better place for many to live. I commend this bill to the House. But we all need to work together with regard to online safety.

                12:42 pm

                Anne Aly (Cowan, Australian Labor Party)

                As those who have spoken before me have stated, the Online Safety Bill 2021 seeks to create a new online safety framework for Australians. It is 'a modern, fit-for-purpose regulatory framework that builds on the strengths of the existing legislative scheme for online safety'. Many of the aspects of this bill should be supported. They tackle some very serious—I would venture to say, criminal—activities online and some very serious and damaging behaviours online.

                For the most part, the bill consolidates various online safety laws into one bill and tidies up those laws, but there are a couple of novel elements to the bill. One element is the articulation of a core set of basic online safety expectations, to promote and improve online safety for Australians—certainly something that is very welcomed and supported by those on this side. The other element is the creation of a new complaints based removal notice scheme for cyberabuse being perpetrated against an Australian adult—an adult cyberabuse scheme. I want to talk, in the time I have left, a little on that particular aspect and some concerns that have been raised about it, as well as some concerns that have been raised around the handling of the bill.

                While stakeholders remain largely supportive of the intent of the bill and of the need for it, their criticisms have focused on the process and on some issues of substance—things like the functions, powers and oversight of the eSafety Commissioner. I concur with the member for Higgins that the current eSafety Commissioner is a woman of extraordinary talents and brings extraordinary experience to the role. However, that does not preclude an eSafety Commissioner from having the kinds of checks and balances that other people with those kinds of powers should be and are subjected to. There are also concerns about the clarity and breadth of the basic online safety expectations, services in scope of the online content regulation schemes, the clarity and the proportionality of the blocking scheme, the appropriate basis for the online content scheme, reduced response time, and, importantly, the rushed public consultation for this bill, following an exposure draft which was released in December 2020. Because of the short consultation period, only 376 submissions were received.

                But coming back to the adult cyberabuse scheme, which is one of the novel parts of this bill, as I said before—that scheme would enable the eSafety Commissioner to make a determination about what is offensive and have that material removed within a short time frame. Concerns have already been raised about the subjective nature of this determination and the lack, as I mentioned earlier, of checks and balances on determinations. These are valid concerns about how these determinations would be made and how this particular aspect of the bill would be used.

                I remember that, in my first term here, we had a very heated debate in this House about section 18C of the Racial Discrimination Act. There was fervent debate on the other side, with free-speech warriors claiming that 18C needed to be removed. The member for Higgins, who spoke earlier—I know that she wasn't a member at the time—might like to go back and have a look at some of the Hansard on that debate, because it was members on her side who stated in their debate that offence was not given; offence was taken. It was members on her side who argued quite loudly for the right of people to offend, under the guise of the freedom of speech. I would urge those people—those free-speech warriors—on the other side to look very carefully at the detail of this bill and to look very carefully at the provisions of this bill and to consider how it might impact on free speech and how some of those provisions in the bill might be used.

                The other concern I have is that this provision in the bill is about adult cyberbullying. I have spent a lot of my career working in the spaces of antiracism, and, for some time, I worked in anti-bullying and anti-harassment policy as well. I can tell you that one of the greatest grievances of people who work in that field is that racism often gets subsumed into bullying and it's often dealt with as bullying. That has a serious consequence, because it means that racism never gets called out as racism. Instead, it's called bullying. But when bullying and harassment are of a racial nature, it is racism. End of story—full stop. It is racism, and it needs to be called out as racism.

                Imagine a scenario where somebody is trolling with racist remarks and gets called out for it, gets called a racist, and the troll takes offence to that and reports it, and, instead of the racist remark being removed, the remarks calling out the racism get removed instead. Under the bill as we currently have it, that is a very likely scenario: somebody who is trolling another individual with racist commentary and gets called out for that racist commentary can claim that they are being bullied and harassed and take action against the person who has called them out.

                I will remind those on the other side what it looks like to be the subject of racist trolling online, although I'm sure that I don't have to remind some of those on the other side. I know that there are people on the other side who have been trolled online for their political views, for their race or for a number of other things. I've had people comment that I should be hung from a tree, that people should put a bullet in the back of my head and that they should take me out to the town square and shoot me or behead me. Imagine if, in a scenario like that, I were to call them out as racist and respond with remarks that they are racist, and I then became subject to the adult cyberbullying scheme and I were shut down. That is a very real possibility.

                That said, it is absolutely true that we need a scheme like this that would allow and require platforms to block the kind of terrible content that we saw with the live streaming of the Christchurch terrorist attack. I thank those who have spoken in this place and in the Federation Chamber on the second anniversary of the Christchurch terrorist attack. It is quite timely that this bill comes in line with that anniversary.

                The Parliamentary Joint Committee on Intelligence and Security is currently undertaking an inquiry into violent extremist groups in Australia, and the terms of reference include online content. But I want to make this point: much of the recruiting and influence of violent extremist groups happens in the dark spaces of the internet, on the dark web. Very little of it happens on the surface web. It's just the tip of the iceberg that we see on the surface web. I want to make sure that this scheme captures the kind of online racism that we see on the surface web. People can be racist trolls and not be members of or affiliated with far-right violent extremist organisations. We cannot let that behaviour slip through the net if we're going to have a comprehensive framework for dealing with these behaviours online.

                I just remind the House that, before I entered parliament, much of my research was on online behaviours. I also did some research on the most appropriate way of dealing with offensive material online, looking at whether or not the whack-a-mole approach of removing material as it appeared was the best approach, and at the kinds of online behaviours that precede radicalisation and violent acts, such as the behaviours that were observed after the act of the Christchurch terrorist but perhaps should have been observed before it. Many of those behaviours happen in the dark spaces. So I really would like to see further work done on this bill, and particularly on the adult cyberbullying scheme, to ensure that it also recognises racism, racial harassment and hate speech, clearly defines them, and has clear and defined processes for the reporting and the removal of this kind of material online.

                That said, I join my Labor colleagues, particularly the member for Gellibrand and the shadow minister, who is sitting here today, in supporting the intent of the bill and the good things that the bill will do in removing really harmful content. But, in adding to the debate today, I also add my concerns about the bill—about the way that it was handled, about the lack of consultation in its handling and about the further steps that it needs to take to ensure that it actually does provide safety. All of us are entitled to feel safe, whether it's in our homes, our workplaces, our streets or online.

                12:56 pm

                Andrew Wallace (Fisher, Liberal Party)

                According to research from the eSafety Commissioner, 44 per cent of Australian teenagers had a negative online experience in the six months to September 2020. I want to briefly acknowledge the work that's been done in this space by the member for Forrest, who's just walked into the chamber. She's been a tireless advocate in this area. It's been great to work alongside her as a bit of a tag team in this space, because there's a lot of work to be done. Well done for everything you've done.

                Tragically, we saw a 129 per cent increase in child sexual abuse material reports over the March to September period last year. But it's not just children and young people who have serious negative experiences online; in fact, for adults, the issue is even more prevalent. The eSafety Commissioner found that 67 per cent of Australian adults had a negative experience online in the 12 months to August 2019, including repeated unwanted messages, unwanted pornography or violent content, scams, viruses, hate speech, abuse and threats. Vulnerable Australians, including those with a disability, are more likely than others to have these sorts of terrible experiences. But perhaps most disturbingly, as many as a quarter of those who have had a negative online experience suffered mental or emotional distress as a result.

                In one indicator of the size of this problem, online harassment and cyberabuse are estimated to have cost our economy up to $3.7 billion to date just in medical costs and lost income for Australians. So, for those out there who may be listening and thinking, 'Why is a Liberal government getting involved in this space?', the answer is that, even if you put aside for the moment the fact that this is the right thing to do, there is an economic cost to Australians.

                For people who are struggling with bullying, depression and social isolation, the online world can leave them no respite from their suffering. In 2021 the internet has no off button. Mobile phones and constant connectivity can mean that the bullying never stops. We see this, unfortunately, in politics. We see this in our own communities. Comments can be anonymous—they can be brutal—and the isolation can be unrelenting. At its worst, vulnerable people can end up being encouraged to take their own lives. Let me say that again: vulnerable people can be, and in fact are, encouraged to take their own lives.

                We talk so much in this place. The discussions in recent days have been about all of the issues that women are experiencing in this country, but let's just go back a step for a moment. If I treat the member for Lyons with the respect that he deserves, I'm hoping that he will return that. If these last few weeks teach us anything, it's that we are in positions of leadership in this place and we should be treating each other with greater respect than we do.

                With eating disorders, a different, though no less dark, side of the internet is on display. I've seen this personally. There are a plethora of websites and forums in which sufferers and enablers share information on how to lose weight in a very unhealthy manner, putting other people's lives at risk. These sites offer those suffering from eating disorders tips and tricks. They discourage treatment and normalise what we now know as a serious mental health disorder.

                We must do more to protect vulnerable people, and particularly we must do more to protect children online. To deal with these ever-increasing social challenges, ultimately we will need action from the online media companies which facilitate abuse and illegal sharing to better moderate their platforms. Like many regions around the country, my electorate of Fisher includes nearly 10,000 young people aged between 18 and 24 who are particularly at risk, as well as thousands more students at 13 secondary schools who are regrettably increasingly facing cyberabuse of all kinds. I have, therefore, and based on my own family's experience, taken a very strong interest in this issue.

                I'm now very familiar with dealing directly with the world's biggest digital and social media companies on their approach to cybersafety. Since my election I've been pursuing change in this area and, in particular, the reduction of cyberbullying and the protection of those online with serious mental health concerns, including eating disorders. With the support of the then Prime Minister, the eSafety Commissioner and the then National Cyber Security Adviser, Alastair MacGibbon, in January 2018, just after Dolly Everett took her own life, I had the opportunity to meet with the DIGI Group, a group of global online businesses, to speak to them about the ideas for change that I believe could make an important difference. This group includes representatives of Google, Facebook, Twitter, Microsoft, Yahoo and others. Though our conversations then and in the years that followed have been constructive, some of these companies have made slow progress in improving safety. The message from industry time and time again has been the same: if we want to see real change in keeping Australians safe online then we need to legislate and force these businesses to act.

                Unfortunately, we've seen in recent weeks more clearly than ever the attitude of some of these major technology companies and their reluctance to act on important social issues when they're not forced to do so. Facebook is the platform where, the eSafety Commissioner reports, the most Australians have negative experiences. Only weeks ago, Facebook responded to entirely valid concerns about the benefits it was receiving without payment from the publishing of professional news content on its platform by depriving Australian users of a key part of its service. Instead of dealing constructively with Australians and their democratically elected government, Facebook attempted to bully and coerce Australians into what they wanted. If we ever needed proof that we require a strong regulator in this environment, to make sure that these companies do the right thing, we now have it. We simply cannot rely on the good intentions of these companies; we must have a strong cop on the beat ready to step in and force change when it is needed, and we need stronger expectations on the platforms that reflect Australians' expectations.

                The bills before the House today, the Online Safety Bill 2021 and the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021, will go a long way towards ensuring that a strong regulator is in place on an ongoing basis, and that it has all the powers it needs. It is the latest measure in this coalition government's unmatched track record of action on online safety. It was, after all, the coalition that established the Children's eSafety Commissioner in July 2015. This government expanded its role to encompass e-safety information and support for all Australians. As far as I'm aware, it remains the only organisation in the world that does this; we were world leaders.

                To date, this government has delivered more than $70 million for the work of the eSafety Commissioner, Ms Julie Inman Grant, who I've done a lot of work with in the last five years. She has done a tremendous job in using those funds and exercising the powers that she has to make Australians safer online. Just in 2019-20 the commissioner received 14,500 reports about prohibited online content and identified more than 13,000 separate URLs which were sufficiently serious to warrant referral to law enforcement. In 82 per cent of cases, the commissioner was successful in having material removed at her request. She also issued 16 notices to Australian and overseas services to remove abhorrent violent material. In one year the commissioner received almost 700 complaints about serious cyberbullying targeting Australian children and referred almost 3,000 people to the Kids Helpline.

                However, the online world is constantly changing and our regulatory framework must adapt to keep up. New technologies and new ways of using that technology have shifted the threat landscape, while the role of materials and services hosted online is becoming more recognised. In short, as I've often said and as this government says, the standards we apply to behaviour in the online world must be the same as those we apply in the real world. If something is unacceptable in real life, it should be unacceptable online.

                The 2018 independent review of the Enhancing Online Safety Act, led by Ms Briggs, laid out a clear path for how to improve our regulations. The bills before the House enact those recommendations, and I'll speak just about a few. First of all, the bills expand the range of situations in which the eSafety Commissioner can exercise her powers. They extend this government's already world-leading cyberabuse take-down scheme from content relating to children to that involving all Australian adults. Under these bills, where an online platform has failed to take down abusive material, the eSafety Commissioner will have the power to issue take-down notices not only to the platform itself but to the end user, where identifiable. These bills also expand the range of online providers to which the eSafety Commissioner can issue a notice to take down. Although social media is critical in the struggle against online abuse, other platforms, including search engines, online games, websites and messaging services, are seeing increasing cyberbullying. It's even happening now with online banking. With these bills, we empower the eSafety Commissioner to take the fight into these new battlegrounds.

                As well as broadening the commissioner's powers, the bills deliver tougher measures to deal with the most serious breaches. One of the most prevalent and harmful forms of online abuse is the non-consensual sharing of intimate images. I'm glad some kids up there in the student gallery have joined us. You might want to listen to this part. It needs to be dealt with quickly and comprehensively, and this bill halves the amount of time allowed to platforms before they must remove such material once a notice is issued by the eSafety Commissioner. Those who don't comply face a civil penalty of up to $111,000.

                When it comes to material described as class 1—that is, material involving the most abhorrent violence and abuse—this bill would give the eSafety Commissioner the power to have it removed from search engines within 24 hours and even to have apps deleted, where those apps are facilitating access. Where events of an abhorrent nature are ongoing, this bill would give the eSafety Commissioner the unprecedented but important power to have websites that have been used to share live images of these events temporarily blocked by all internet service providers in this country.

                To me, perhaps the bill's most important provision, however, begins to correct a cultural issue which I believe lies at the heart of the challenge posed by online bullying and harassment, and that is the question of anonymity. Studies have proved that the feeling of anonymity facilitates abhorrent behaviour on the part of people who would never consider it if their names and their faces could be seen. Behind a shield of anonymity, malicious actors feel invulnerable and, in all too many cases, they have been. I believe this cannot be allowed to continue. We must hold individuals accountable for their actions online, just as we hold them accountable in the offline world, and, to do that, we must be able to identify who they are, and this bill will do just that. I commend it to the House.

                1:11 pm

                Brian Mitchell (Lyons, Australian Labor Party)

                I am pleased to have the opportunity to speak on the Online Safety Bill 2021 and the Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021. Members may recall that in recent weeks I've been making numerous calls for action to be taken against the foreign based website Etsy, which had on its platform abhorrent products promoting child sexual abuse, putting these products for sale alongside things like Father's Day gifts. I'm certainly on the side of cracking down on rogue internet operators who fail to uphold community standards.

                Labor does support measures to consolidate, update and enhance online safety laws for Australians, and the bills do seek to create a new online safety framework for Australians—'a modern, fit-for-purpose regulatory framework that builds on the strengths of the existing legislative scheme for online safety'. For many years, Australians have been protected by laws to support online safety, and it is important these laws are kept up to date. To a large degree, the bills represent regulatory housekeeping by consolidating various online safety laws into one bill, with some updates, in response to Lynelle Briggs's independent review of online safety laws, which reported to the government in October 2018. So the government's had this for a while and has not acted with the speed with which it should have. Labor does support the aim of consolidating online safety laws into a new framework and new elements of the bill, including the creation of a new complaints-based removal notice scheme for cyberabuse being perpetrated against an Australian adult and the articulation of a core set of basic online safety expectations to improve and promote online safety for Australians.

                Although the government has been spruiking this new Online Safety Bill as if it already exists—and it has been spruiking it for almost two years—it's just now that it's introducing a bill that addresses a recommendation of the Briggs review for a single, up-to-date online safety act. Unfortunately, even after all the time the government's had, it still doesn't have this bill quite ready to go. Labor are supporting it because we're taking the view that something's better than nothing, but this bill really is half cooked, and the government should have done a better job. It's a very important piece of public policy, and the government's just come at it almost as an afterthought. When asked about key operational aspects of this bill, such as the novel adult cyberabuse scheme, even the eSafety Commissioner referred to it as a 'sausage' that is still being made, and that was as recently as last week. Last week, the eSafety Commissioner said the sausage is still being made. It's really not good enough.

                Given the significant passage of time since the Briggs review reported, it is disappointing that the government has proved incapable of conducting a process that satisfies stakeholders in terms of both process and substance. A wide range of stakeholders still have valid concerns with this bill which the government has failed to address. Submissions to the Senate inquiry into the bill raised concerns about several matters that they argued have not been addressed and which some considered long overdue for reform, such as the functions, powers and oversight of the eSafety Commissioner, the clarity and breadth of the basic online safety expectations, and the public consultation processes associated with the bills, amongst other things.

                I personally have some concerns about things like mission creep. We start out with good intentions on these things, and I do back this bill. As I say, it's not ideal but it's better than nothing. But I am concerned that, given the holes in this bill and the patent fact that it's going to need pretty quick revision pretty soon, there is a risk of mission creep, where what starts off with good intentions ends up being something a little bit more insidious. I have a background as a journalist. I would hate to think that the provisions of this bill would end up in any way impacting on the ability of journalists to report freely and openly in this country. Journalism, by necessity, often presents the public with confronting images. For example, who can forget that terrible image of George Floyd with a police officer kneeling on his neck until he was dead? It's not beyond the scope of reason to think that somebody may well complain to an eSafety Commissioner in the future that that sort of imagery is abhorrently violent and should be caught up in the scope of this bill and taken down. I have very high regard for the current eSafety Commissioner, but she won't always be there. Who knows who will hold that position in the future? Who knows what qualifications they will bring and what ideological or political baggage they might have trailing behind them when they are given fairly extraordinary powers under this bill?

                So I do have those concerns. I don't think those concerns are unwarranted, and I hope that, once this goes up to the Senate as part of that review process, it's taken on board a little bit more. We need to make sure that journalism and particularly the public interest are not caught up by bad-faith actors, people who make vexatious complaints to an eSafety Commissioner for either religious or ideological reasons or simply because they want content that they happen not to agree with taken down. So we need some more work done there.

                Other significant matters of concern include how the bill interacts with the frameworks of safeguards put in place by the telecommunications assistance and access regime, as well as matters that interact with freedom of speech. That goes to some of the matters I've just raised. Finding the balance between free speech and protections against certain kinds of speech is a complex endeavour. It's not easy. We all like to think that we live in a free society with free speech. We don't have free speech mandated in our Constitution, unlike in the US, but we like to think that we have freedom of speech in this country. But it comes with obligations and responsibilities, and finding that sweet spot is a difficult endeavour.

                We are concerned that this bill represents a significant increase in the eSafety Commissioner's discretion to remove material, without commensurate checks and balances. This is a key concern of mine: an unelected official is being given extraordinary powers over public information. As I say, this is being done with good intent—I don't question the government's intention on this—but an unelected bureaucrat is being given extraordinary powers, and I really think we need to look very closely at how that person is appointed, what qualifications they bring and also what oversight this parliament has. I am not a fan, personally—I can't speak for the Labor Party on this—of the executive having sole discretion over this. A minister having the hire and fire power over an eSafety Commissioner does not sit well with me. Finding that balance is important.

                Whilst we are supportive of a scheme for adult cyber abuse, it's very curious that a government that has made repeated attempts to repeal section 18C of the Racial Discrimination Act on the grounds that it unduly restricts free speech, despite the availability of defences in section 18D, is now seeking to rush through a half-cooked bill that empowers the eSafety Commissioner with discretion to determine matters of speech in relation to adult cyber abuse without greater checks and balances or operational clarity. They've had this for two years. It's almost like they've just let it sit there percolating, doing nothing, and now they're rushing it through the parliament without having done the checks and balances properly.

                This government talks a big game about its expectations of social media platforms, but, to date, it has failed to do its job by updating Australia's online safety laws. While the government is right to expect digital platforms to offer more in terms of transparency—and I'm not a fan of the way those digital giants do their job; they need to do better—so too must the government be prepared to provide transparency around decision-making, particularly on matters that engage with human rights.

                In their additional comments to the Senate inquiry, Labor senators recommended that the government consider further amendments to clarify the bill in terms of its scope and to strengthen due process, appeals, oversight and transparency requirements, given the important free speech and digital rights considerations that it engages. Online safety is of increasing importance to all Australians as we spend more time online—it's just everywhere in our home lives and in business—and I urge the government to take the time to get it right. You've had two years. I don't know why you're rushing it through now without taking account of the many concerns that have been raised through the Senate process. Get it right the first time. Don't rush this through and then have to cobble it back together with amendments over the next year, as you are doing with other bills. The biosecurity one comes to mind: you made a mess of that, and now the biosecurity amendment bill is coming back into the parliament. Get it right the first time.

                Labor supports online safety and we do support these bills. The safety of Australians online, particularly children, is of paramount importance, so Labor will work constructively with the government to iron out concerns with these bills in time for debate in the Senate. But, in the meantime, Labor will not oppose these bills. We just implore the government: do better.

                1:22 pm

                Julian Simmonds (Ryan, Liberal National Party)

                It's a great pleasure to rise today and speak on this bill, the Online Safety Bill 2021, and to support it very enthusiastically and wholeheartedly. This is obviously a passion of mine. I'm a young dad and, like a lot of families in my electorate of Ryan, I'm passionate about making sure that our kids are not only safe when they're online but also supported in the real world from the harmful effects that can come from negative behaviour online.

                Before I go on, I really wanted to pay tribute to the member for Forrest, who is in here watching—the wonderful Nola—because I'm a recent addition to this place and it has been a great honour of mine to come into this place and take up this issue of online harm. The member for Forrest is one who I know—and there are others—has been taking up this particular fight for some time. I think that, long before the full negative effects were widely appreciated, the member for Forrest appreciated them and took them up not in a virtue-signalling way but in a very practical way, by working with her schools and taking it upon herself to educate young people about what is safe behaviour online. So, Member for Forrest, you should be congratulated for that.

                Our kids are continuously spending more and more time online, and there are some very harmful environments online. I have to say technology continues to evolve faster than we are evolving in our understanding of it. Sitting on the Standing Committee on Social Policy and Legal Affairs and looking at those recommendations into online gambling and access to online porn, it struck me that it's not just about messaging apps or Facebook or Instagram, which we all understand relatively well—me not as much as I should, because I'm not a tech-head. There are things like loot boxes in games. There are things like chat functions, which kids can use to chat to people. Kids can be playing these games, in front of their parents, in full view. A parent can think they're doing the right thing by watching over their kid, but they don't know who is on the other end of the earpiece or what they're speaking about. It's a very dangerous space for parents. I'm struck by the fact that it's a space in which parents can be incredibly diligent and feel like they're doing everything right but still get it wrong. That is a really dangerous space. They'll get it wrong because (1) it's a struggle for parents of any generation to keep up with the technology of a younger generation and (2) these predators are incredibly sophisticated and determined in the horrible things that they are trying to do. The point I'm trying to make, and where I'm going with this little monologue, is that this bill is important because it gives parents the tools to help their kids in a way that they couldn't before. As we have seen, the eSafety Commissioner has had significant success with younger people. It also gives parents and adults the opportunity to push back against some of the trolls and the abhorrent online behaviour that we see through social media and other technologies.

                To understand the prevalence of the issue, I want to give this House a little bit of an understanding of what we're up against when we're talking about vulnerable people in our society and the harm that can come from technological abuse. Recently, the Australian Centre to Counter Child Exploitation, known as the ACCCE—it's a tremendous facility hosted in my home town of Brisbane, and I'm incredibly supportive of it—completed a study into understanding community awareness, perceptions, attitudes and preventive behaviours. They found some remarkable things. Four out of five children aged four are using the internet, and 30 per cent of those children have access to their own device. Incredible! Again, this isn't about lax parenting, although, from my own experience with my child, I know I'd be very careful about providing a four-year-old with a device. Some parents think they are doing the right thing by giving their child an internet-enabled device, opening up all sorts of educational and learning opportunities for that child. But with that opportunity comes an incredible responsibility and an incredible potential for harm. One in two children under the age of 12 have their own device. Despite this, only 52 per cent of parents talk to their children about online safety. By the age of 11, most children are using the internet unsupervised. This is incredibly concerning. At the age of 11, the cognitive abilities of these kids are limited. Their ability to make sensible, rational decisions and to understand the full scope of the decisions they're making when they enter into conversations that could be harmful to them is limited. They don't fully understand. That's all right. That's an opportunity for parents and our entire society. The saying is: 'It takes a village to raise a child.' We all have to throw our arms around these parents and kids to make sure that everybody understands the harm that can come from online technological abuse, whether it be to kids or to parents.

                I take the point of the previous Labor speaker. It is a difficult issue. But, with due respect, when he says, 'It's hard to find the balance,' I'm not so sure it is. I think the balance is quite clear. We expect the same rules, laws and norms in the online world as we expect in the real world. Up until now, we have been willing to forgo those protections in the online world because we felt that, to get the benefits of all this increased connectivity between people in a globalised world, we had to give up the protections that we have in the real world. That's just not the case. I don't think that Australians are willing to accept it anymore. I don't think that they will cop it anymore. What the government has done by moving against the technology companies to have them pay for news content demonstrates our desire to make sure that the same rules apply in the online world as they do in the real world. Frankly, I think these technology companies should be held accountable as the publishers that they are. It's not acceptable to have anonymous content on these platforms. If these platforms do allow anonymous content then they have to take responsibility for what is said in that anonymous content.

                Rob Mitchell (McEwen, Australian Labor Party)

                Order! The debate is interrupted in accordance with standing order 43. The debate may be resumed at a later hour.