
New powers for online safety body begin

Maeve Bannister | AAP

New online safety measures to protect Australians on the internet will come into force this weekend, giving the national regulator more powers to take action against abuse.

From Sunday, the eSafety commissioner will be able to compel tech companies to consistently report how they are responding to online harm.

The timeframe within which platforms are required to respond to a 'take down' notice from the commissioner will be cut to 24 hours.

The changes come into effect as the federal government investigates further potential online safety measures.


It wants to introduce laws that would force social media platforms to take down offending posts and, in some circumstances, reveal the identity of anonymous posters.

But social media companies want the government to see the effect of the eSafety commissioner's new powers in addressing online abuse before introducing further measures.

Twitter representatives appeared on Friday after the company's submission to a parliamentary committee urged the government to reconsider the time frame for introducing new measures.

"Twitter recognises the need to balance tackling harm with protecting a free and secure open internet," the submission said.

"The potential consequences for hasty policy decisions or rushed legal regimes will stretch far beyond today's headlines, and are bigger than any single company."

But mental health advocates say social media and big tech companies have been allowed to write their own rules for too long.

Mental Health Australia chairman Matt Berriman says the idea that social media companies will do the right thing when it comes to keeping people safe online is a "fallacy".

"Quite simply, big tech and social media companies will put profit over people," he told the committee on Friday.

"We cannot trust big tech and social media to moderate themselves despite their many hollow promises."

Twitter policy director Kara Hinesley echoed Google and Facebook representatives who told the committee it was important to look at context when assessing abusive content.

"Within a younger demographic there are terms that are used in a way that might seem on its face abusive but at other times would be seen as appropriate language between people who know each other," she said.

"Whenever there would be that word or phrase used to directly target an individual that could run foul of our hateful content policy ... that's where detail and more information is important."

The committee is due to report to parliament in mid-February with further recommendations for online safety.
