Social media companies will be required to take proactive steps to keep Australians safe online under the Federal Government’s plan to legislate a ‘digital duty of care’.
This marks the Albanese government’s latest move to shift responsibility for keeping users, especially children, safe on their platforms onto the shoulders of the social media giants.
The digital duty of care was recommended in a yet-to-be-published independent review of online safety legislation, which was submitted to the government last month.
This follows similar moves by the UK and European Union to require platforms to move from reacting to harm to taking reasonable steps to prevent foreseeable harm.
Communications Minister Michelle Rowland said the obligation would build on the existing complaints and takedown regime under the Act.
“What is needed is a shift away from relying solely on content regulation to respond to harm, to broaden our perspective on what online harm is, and move towards systems-based prevention,” she said.
Michelle Rowland says social media platforms need to move from responding to harm to preventing harm. (ABC News: Ian Cutmore)
Strong penalties for companies that do not take precautions
The new obligations come in addition to the government’s move announced last week to ban children and young people under 16 from using social media.
The Albanese government is increasingly focused on escalating online harms, including explicit and hateful content.
Rowland said changes were needed to focus more on how content can negatively impact mental health.
“The Albanese Government has made it clear that it stands with the millions of concerned parents, children and the nation as a whole,” she said.
“This is part of a growing global effort to provide a more systematic and proactive approach to making online services safer and healthier.
“We will ensure that regulators can impose strong penalties if platforms materially and systematically breach their duty of care.”
The government wants to impose strong penalties on platforms that do not take precautions to keep people safe. (Reuters: Gonzalo Fuentes)
The minister said the new obligations mean platforms must continually assess and take precautions to mitigate potential risks.
Ms Rowland made the announcement as the government faces a setback on another bill, this one aimed at curbing the spread of misinformation and disinformation online.
The government’s chances of passing the Misinformation and Disinformation Bill were dealt a further blow on Wednesday when independent senator David Pocock announced he would oppose it, leaving the government with only a narrow path to passage.
Digital duty of care follows international efforts
Those global efforts include the European Union’s digital safety laws, which require platforms to be designed and operated with care to ensure people are not harmed when using them.
Companies found to have violated those laws can face fines of up to 6 percent of their annual revenue, which for platforms such as Meta could amount to hundreds of millions of dollars.
The UK has also introduced legislation that will not only restrict access to pornography, but also hold platforms accountable for preventing access to content that promotes suicide and eating disorders.
The Albanese Government said the proposed changes would make Australia a world leader in online safety.