Australia’s Online Safety Act, passed in July 2021, came into effect on Sunday, allowing adults in the country to report cases of online bullying to the eSafety Commissioner, Julie Inman Grant. The act empowers the eSafety Commissioner to order social media platforms to take down content that amounts to bullying of Australian adults within 24 hours, failing which they face fines of up to $555,000. The law defines what counts as bullying and is more detailed than the country’s existing cyber laws. It shortens the time frame for the removal of bullying content from 48 hours to 24 hours. While laws addressing the bullying of children already exist, the new act covers bullying against adults as well as children.

What is the Online Safety Act?

Under the new law, the eSafety Commissioner is responsible for implementing the act, which aims to promote the online safety of Australians. The Commissioner will also supervise complaints and objections about the sharing of intimate images without consent, manage online content and coordinate “activities of Commonwealth Departments, authorities and agencies relating to online safety for Australians”.

After a complaint against “the provider of a social media service, a relevant electronic service or a designated internet service” is registered, the provider is issued a removal notice. A removal notice can also be sent to the hosting service provider or to the person who posted the cyber-bullying or abuse material. The provider or person may also be asked to “refrain from posting cyber-bullying material or apologise for posting the material”. The same process applies to the non-consensual posting of intimate pictures: a person who posts, or threatens to post, such images is liable to a penalty. If the content is not removed within 24 hours, a fine of up to 500 penalty units applies: up to $111,000 for individuals and up to $555,000 for organisations.

The act also allows the eSafety Commissioner to send a link-deletion notice to search engines, requiring them to stop providing links to certain material, and to app distribution services, requiring them to stop letting users download apps that enable the posting of bullying or abusive content. The Commissioner can also make rules regulating digital service providers and digital content. Material is considered bullying or abusive if it is offensive, depicts abhorrent violent conduct, or is immoral, indecent, threatening, intimidating, harassing or humiliating, or is an intimate or non-consensual image.

“Serious harm could include material which sets out realistic threats, places people in real danger, is excessively malicious or is unrelenting,” said Grant, who is also working towards a mechanism that would require age verification to access adult content online by the end of this year. The Commissioner is responsible for ensuring the content is removed, but legal action does not fall within her jurisdiction. “This is a world-first scheme. From the end of January we will be able to act as a safety net to give Australian adults who have been subjected to serious online abuse somewhere to turn if the online service providers have failed to act in removing the abusive content,” Grant said. The act also gives the eSafety Commissioner the power to block certain online content, including violent material, altogether, so that no one in the country can access it.
What would be the process to be followed by someone who is being bullied or abused online?

First, it is important to report the content on the social media platform where the bullying or abuse is taking place. Filing a complaint with the police should be the next step, since online harassment is a crime. Then, if the content is not removed, the matter can be reported to the eSafety Commissioner, who can issue a notice requiring removal of the content within 24 hours, after which a fine can be imposed. “If a platform fails to take action, people can come to us to make a report. Our new investigative and information gathering powers will allow us to investigate and assess complaints, and decide what action we can take,” said Grant.

Who is the eSafety commissioner?

Grant has spent two decades in public policy and safety roles at organisations including Twitter, Microsoft and Adobe. Before moving into public policy through her work at Microsoft, Grant worked with the US Congress in Washington. She then began a 17-year career at Microsoft as a government relations professional and went on to become its Global Safety Director for safety policy and outreach. After Microsoft, Grant joined Twitter, where she developed and led Twitter’s policy, safety and philanthropy programmes in Australia, New Zealand and Southeast Asia. The Australian Financial Review has named Grant one of the country’s most influential women, and in 2020 the World Economic Forum named her to its Agile 50 list, which recognises leaders in agile governance.

About the Online Safety Act, Grant said, “This ground-breaking scheme gives us the ability to help those Australian adults who have been subject to the worst types of online abuse, which is becoming an all-too-common occurrence. If a report meets the threshold, we can issue a notice to the platform to get that harmful content removed.”

Have any other bills on digital abuse and trolling been passed in Australia?

A few other pieces of legislation and bills have been drafted targeting social media organisations and online trolling. The Australian Competition and Consumer Commission in 2021 released a report targeting Google and Facebook and the advertising on these platforms. As a result, these tech companies faced stricter regulations on advertising and on the algorithms that decide what appears in users’ online feeds. The report also recommended stricter regulations on the privacy of users’ information. The online trolling bill, if passed, would allow the government to hold social media companies responsible for trolling comments and content posted on their websites. The bill would require social media organisations to unmask the identities of trolls and would penalise organisations that fail to do so.