Child welfare organisations are calling on Australia's online safety regulator to reject industry codes covering platforms such as Meta and TikTok, warning that the industry-created rules do little to improve safety and fall behind protections being introduced internationally.
The eSafety Commissioner is currently considering whether to register industry safety codes developed last year by internet companies to regulate the handling of certain online content.
The self-regulatory approach, a feature of the controversial Online Safety Act, has been criticised for offering relatively weak protections since the eight draft codes were released last year.
Among them are the social media and relevant electronic services codes, which set rules for how platforms such as Facebook, Twitter and TikTok should handle harmful content and protect young users.
A coalition of child safety groups, made up of the Australian Children’s Rights Task Force, ChildFund, Bravehearts and Reset Australia, wrote to eSafety Commissioner Julie Inman Grant on Wednesday, asking her not to register the proposed codes and to meet with the group to discuss alternatives.
“Neither the social media code nor the relevant electronic services code enforces existing safety standards,” the letter to Ms Inman Grant said.
“Rather, they appear to document the status quo and impose only practices that are already in place.”
The group says the proposed social media industry code, and the way services are designated under it, could lower safety standards, for example by allowing some companies to stop the current practice of scanning their services for child sexual exploitation and abuse material.
The proposed codes would also allow weaker protections than are required in other markets where regulator-driven rules are in place, the letter said.
Of particular concern is Australia’s approach to default settings for users aged 16-18.
In the UK, Ireland and California, regulators or legislators have established rules under which social media companies must apply stricter default privacy settings for children aged 16 to 18.
The Australian code, however, only requires default privacy settings for users under the age of 16, leaving some Australian teens “less protected”.
“The [industry code] version under consideration will allow these platforms and services to effectively ‘turn off’ child safety features and options in Australia. This would make children less safe online,” the letter said.
The group is also asking Ms Inman Grant not to wait for the outcome of the privacy law review before deciding on the codes.
“Safety and privacy are a connected experience. Minors with private accounts are not recommended as ‘friends’ or as accounts to ‘follow’ to adult strangers. As Meta found, 75% of ‘inappropriate contact between adults and minors’ on Facebook was due to the ‘People You May Know’ friend recommendation system. Attempting to lock in lower standards while awaiting the Privacy Act review is an unacceptable approach to safety.”
Officials at the eSafety Commissioner’s office said the regulator had provided feedback to industry on the draft codes at a public hearing in November.
The industry codes were due to be submitted to the regulator by November 18.
The eSafety Commissioner’s office has been asked, through a Senate question on notice, to provide a copy of the draft codes. The answer has not yet been published, but a version of the draft codes is available online.
Do you know more? Contact James Riley by email.