House debates

Tuesday, 16 March 2021

Bills

Online Safety Bill 2021, Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021; Second Reading

4:33 pm

Adam Bandt (Melbourne, Australian Greens)

This week in Canberra, we've been reminded just how significant are the problems of violence that we've got to tackle in the real world. Of course, the Prime Minister hasn't even had the courtesy to go out and meet the thousands of women who came to this place to march and demand justice and change. But we're also reminded this week that, just as there are areas in the real world that we need to change—and it's predominantly men who need to change their behaviour—there are changes required in the online world. The online world is not exempt, and, in fact, in many instances, because of the speed at which things can happen online, approaches are needed there as well. We joined in the condemnation of the streaming of the terrorist massacre in Christchurch, for example. That streaming was something that was abhorrent and should not have occurred.

Similarly, as previous speakers have mentioned, the issue of cyberbullying is something that requires tackling as well. In many instances it overlaps with what has historically been known as bullying, and in other instances there are specific ways that cyberbullying takes place. Of course, we see again that, because of the speed with which digital images can be transmitted and the capacity to do so, the non-consensual sharing of intimate images is something that has caused significant grief to so many people, including so many women.

So I think you would find, across this parliament, not one person who says there are no problems in the online world. You will not find one person across this parliament who says there are no problems of violence or exploitation that need tackling, in the same way we do with respect to the offline world. But what is of great concern in those very significant areas that need to be addressed—including the protection of children, which is a critical duty for all of us—is that, when the standards that one might apply with respect to the protection of children—the limiting, for example, of what children may see—slip over and start to be applied to what informed, consenting adults are able to do, it raises issues, and we must tread carefully. We must tread incredibly carefully, because if we don't we could find ourselves, as the parliament, saying to informed, consenting adults, 'We are now going to allow government to determine what you can read and do and participate in online.' That is why it is very critical that we get the balance right—because a lot is at stake if we get it wrong. It will inhibit the freedoms and the rights of people in this country, of adults in this country, to participate and read online in the way that they choose.

When there are 370 public submissions on the exposure draft of a bill, with significant amendments proposed, and the government come in here 10 days later with a bill that has zero amendments to address those significant concerns, and then they rush this bill through this place after an inquiry process lasting less than two weeks, where many of those same concerns were raised, it raises the presumption—in this case, the right presumption—that the government have not got the balance right and are engaged in huge overreach here. I suspect that everyone in this parliament would support the important aims of protecting children, addressing the non-consensual sharing of images, and dealing with the rise in hate speech and hate crimes and their perpetration through online media. You would find support for that if you took a measured approach to finding a way of dealing with that that did not infringe on other people's rights. But that is not what the government has done. Instead, we have a bill—the Online Safety Bill 2021—that gives an unelected official significant power to determine what people can read online in this country.

I make no comments about the particular person who at the moment occupies the role of eSafety Commissioner. I'm not commenting on that particular person at all. I'm talking about the principle of whether it is right that very broad, generalised and undefined standards—things like 'basic online safety expectations', which is the phrase used in this bill, and broad brushstrokes about definitions of things like 'harmful'—should be delegated to an unelected official who then has the power, armed with that, to determine what is allowed to remain online or not. Under this bill, that unelected official will have the power to take down material that they think does not comply with these very broadly defined standards and to require that internet service providers and others who are hosting online content comply.

This is a very broad, proactive power to go out, search, find and order that things be taken down. When such broad powers are given to someone who is an unelected official, what you will find is that people in the industry will start censoring themselves for fear of being subject to one of these orders. They are going to start taking down things that might not contravene the law, just for fear of being subject to an order according to some very broadly defined principles. In this respect, Electronic Frontiers Australia made a submission to the Senate inquiry, and they are right when they say:

This is a breathtaking amount of power to be handed to a single person, regardless of the level of oversight. The severe lack of checks and balances over the exercise of power granted by the Bill only compounds the danger. Granting extraterritorial jurisdiction over all Internet content to an unelected person appointed by the government of the day is an astounding proposition in a country that holds itself out as a liberal democracy.

It is worth noting that there is almost no way of challenging the decisions that this person makes. In almost every other sphere of our society, under the laws we pass, if an official does something, especially something as significant as saying, 'You are not allowed to read this thing online,' there's a capacity to appeal and to review such decisions. But here that's almost non-existent.

As others start to pre-empt this outcome and restrict content, and as the chilling effect of this legislation flows through to areas that it was not intended to flow through to, a lot of these take-down approaches are going to be automated. It's not going to be an individual person sitting there clicking and browsing and looking at everything; it will be automated. What we know is that computers and algorithms are not always that good at distinguishing a violent image that people would find abhorrent from, say, an image of a police officer kneeling on a black man's neck, and so images that are designed to hold power to account will also find themselves the subject of potential censorship under this legislation. That poses significant risks to groups like the Black Lives Matter movement but also to anyone who is fighting for change and who wants to use the internet as a platform for broadcasting abuses of power. They could find themselves falling foul of an unelected official's sense of what is appropriate. Then, when the official—or algorithm—takes it down, there is nowhere to go; they're censored. When similar legislation was passed in the United States, sex workers who had shifted their legal activity online during the pandemic found themselves caught in a situation where, although the work they did was legal, they could not pursue it online, because of restrictions of this type. That has been the experience overseas.

There are a wide range of groups that are going to be affected by this legislation. To the extent that we are talking about safety, it's worth remembering what Digital Rights Watch said during the very short inquiry that was allowed. They made the point that, when sex workers are forced offline, they are often pushed into unsafe working environments, which creates direct harm. So this bill has the capacity to do harm, which is part of the reason it should not be rushed through. We oppose the government rushing this thing through when so many people have raised so many concerns about it. Domestic Violence Victoria made the point in the consultation on the exposure draft that the complaints process and the lack of a review of it as part of this legislation, in their words, 'provides opportunities for vexatious and malicious use of technology by perpetrators to further perpetrate family violence'.

Then there's the prospect that, as currently drafted, the bill could provide to the commissioner the power to limit, restrict or undermine encrypted services and communications, as well as information-gathering and investigative powers. The extension of this to encrypted services suggests that the government is going well beyond what it says on the tin, because it's not about what people can see if they happen to be surfing the internet; it is now about going into private messages that no-one could say a child could see or is able to see. They're talking about accessing potentially encrypted services.

It's astounding that a government that is running away from any independent inquiry into its own actions—when it comes to questions of alleged violence—is saying, 'Oh no, we expect to have access to other people's secret and private communications, and we'll do it under the guise of this bill and we'll do it via an unelected official, and there will be limited rights of recourse.' Of course, they have form on this, because this is the approach that was taken with respect to the assistance and access act, something that we opposed as well.

There are two things we need to do in this country. The first thing we need is for the government to withdraw this bill and come back with a redraft that deals with the issues that you would find acceptance of across the whole of parliament—about how we deal with non-consensual sharing of images, how we deal with cyberbullying, how we deal with those images of hate crimes that get broadcast online. Go back and come up with something that deals with those key issues. But don't let it overreach and step into dealing with people who are committing no crime, who have never been accused of committing a crime and who are going about their lawful activity online. Don't let it dictate what they can or can't do online, because that is what this government is doing—rushing a bill through that will allow for the censorship and regulation of content from people who have done nothing wrong and who are participating online. All of a sudden, an unelected official is going to tell you what you can and can't see.

The second thing that we need to do in this country is institute some basic privacy and digital rights that are leading standards, something on a par with the European Union's General Data Protection Regulation, or GDPR. If we had some basic digital rights enshrined in this country, then you could have a sensible debate about things like what the government is proposing, because people would know that their rights were protected. But at the moment we can't know that. Why does the government want to go beyond the stated intent and name of the bill and start regulating, in an unacceptable way, what adults are able to do online? It is part of creeping moves to exercise greater power over our freedoms and responsibilities, and that's why in its current form, unless it's withdrawn and redrafted, the bill cannot be supported.
