
Tech giants face online safety inquiry

Maeve Bannister, AAP
Global tech giants will front a committee examining how to make social media safer for Australians. Credit: AAP

Some of the world’s biggest tech companies will have their say on proposed laws that would hold them accountable for online harassment and abuse on their platforms.

The proposed laws would force social media platforms to take down offending posts and, in some circumstances, reveal the identity of anonymous posters.

On Thursday, representatives from Google, TikTok and Meta, the company that owns Facebook and Instagram, will appear before the committee investigating how to make social media safer for Australians.

In its submission, Google Australia’s representative Samantha Yorke said the company had not waited for regulation to act in tackling illegal or harmful content on its platforms, which include YouTube.


“Our approach to information quality and content moderation aims to strike a balance between ensuring that people have access to the information they need, while also doing our best to protect against harmful content online,” she said.

The submissions from Meta and TikTok outline a number of policy measures designed to prevent abuse on the platforms.

Meta says it has cut the prevalence of hate-speech content by more than half within the past year and is proactively detecting more than 99 per cent of seriously harmful content.

TikTok says that between April and June 2021, more than 81 million videos were removed from the platform for violating its guidelines.

Of those videos, TikTok says it identified and removed 93 per cent within 24 hours of posting, 94.1 per cent before a user reported them, and 87.5 per cent with zero views.

But on Tuesday, criminologist Michael Salter told the committee more transparency was needed from tech giants in reporting harassment and child abuse on their platforms.

“Far too often what we’re provided from social media company reports on these issues ... is statistics that are most friendly to them,” he said.

Dr Salter said social media companies needed to accept they have a duty of care for their users.

“Having basic safety expectations built into platforms from the get-go is not too much to expect from an online service provider,” he said.
