Perpetrators who post an intimate image without consent, including deepfakes, could face penalties of up to $111,000, and online platforms will be forced to take down image-based abuse material faster under new powers given to the eSafety Commissioner.
The new rules under the Image-Based Abuse Scheme are a key part of Australia’s new Online Safety Act, which aims to better protect victims of image-based abuse.
The updates to the Image-Based Abuse Scheme come as eSafety Commissioner Julie Inman Grant this week delivered her speech at the 13th Australian Cyber Conference, outlining the agency’s regulatory priorities under Australia’s new Online Safety Act, which commences on 23 January 2022.
eSafety also publicly released a paper outlining the new regulatory priorities, giving industry clear insights into how the agency will administer its protective schemes and when and where it will wield its strengthened enforcement powers.
Ms Inman Grant said image-based abuse affects 1 in 10 Australians and disproportionately harms younger women aged 18 to 25, among whom the incidence is 1 in 5.
“Under our updated Image-Based Abuse Scheme, the time online platforms have to take down image-based abuse material after eSafety issues a removal notice is reduced from 48 hours to 24 hours,” Ms Inman Grant said.
“Under the new rules, eSafety can also name and shame platforms which allow publication of non-consensually shared intimate images on two or more occasions in a 12-month period and are in breach of their own terms of service.
“This is a way to call out platforms that aren’t doing enough to combat image-based abuse.”
Ms Inman Grant said there will also be major consequences for perpetrators who share or threaten to share sexual images of victims, images of victims without religious or cultural attire or those who use new types of technologies against victims.
“Aside from facing possible criminal charges in their jurisdiction, perpetrators can also face penalties of up to $111,000 sought by eSafety if they post or threaten to post an intimate image, including in cases of sextortion,” Ms Inman Grant said.
“The changes will also cover image-based abuse via new types of technologies, such as deepfakes and immersive technologies, as they become more popular in the future – all tech trends and challenges we have previously identified.”
Ms Inman Grant said eSafety helps victims of image-based abuse every day and hopes to do more to protect Australians with these expanded powers.
“eSafety investigators handled 2,687 complaints about intimate images and videos that had been shared without consent in the last financial year alone. We also have an 85 per cent success rate in removing intimate images and videos,” Ms Inman Grant said.
“The new changes mean victims of image-based abuse can be further assured that both perpetrators and platforms will face severe consequences.”
The update to Australia’s Image-Based Abuse Scheme is among a series of new regulatory guidance documents to be released by eSafety between now and the end of the year.
The expanded Image-Based Abuse Scheme will begin operation on 23 January next year.