‘Many parents and schools feel ill-equipped to deal with the challenge of protecting children from online dangers,’ was the conclusion reached by the Federal Government’s Online Safety Working Group in November 2012. Since then, the Group has been working towards enhancing the online safety of children. One outcome has been the Enhancing Online Safety for Children Act 2015 (Act), which was recently passed by Parliament and will commence operation by 24 September 2015.
The Act in a ‘tweet’
The Act has two key objectives: to establish an office that takes a national leadership role in the online safety of children, and to protect Australian children against cyber-bullying. The Act primarily seeks to achieve these aims by:
- establishing the office of the Children’s e-Safety Commissioner (Commissioner);
- establishing a mechanism whereby harmful cyber-bullying material targeted at an Australian child (Harmful Content) can be removed from social media sites quickly;
- establishing a scheme for the receipt and investigation of complaints relating to Harmful Content; and
- providing the Commissioner with various enforcement powers with respect to both the social media site on which Harmful Content is posted and the end-user who posted the Harmful Content.
The Act, through the Children’s e-Safety Commissioner, attempts to address the conclusion of the Online Safety Working Group by bridging the gap between incidents that schools are equipped to deal with and more serious incidents of a criminal nature which should be handled by police.
Will your business/organisation be affected?
The Act will impact on:
- organisations which own and/or operate social media services; and
- organisations which are connected to social media services.
It is also important for education and welfare providers to be aware of the way the Act will operate, as this may be relevant to their clients.
Is your business a social media service?
The Act defines a social media service as an electronic service which:
- has the sole or primary purpose of enabling social interactions between two or more end-users;
- allows end-users to link to, or interact with, other end-users;
- allows end-users to post material on the service; and
- satisfies any other conditions as are set out in the legislative rules.
Provision is also made for an electronic service to be deemed to be a social media service.
The Act does not automatically apply to all services which fit the definition of a social media service. Rather, the Act identifies and applies to two types of social media services: ‘tier 1’ and ‘tier 2’ services. A service becomes subject to the Commissioner’s regulatory powers only once it is categorised as a tier 1 or tier 2 service; a service falling outside these two classifications escapes the Commissioner’s scrutiny and powers.
To become classified as tier 1, a service must make an application to the Commissioner, who will declare the service a tier 1 service if satisfied that the service complies with basic online safety requirements. The idea is that tier 1 services already have robust systems in place to handle Harmful Content internally. On this basis, the Act limits the Commissioner to issuing requests to remove Harmful Content within 48 hours of receiving a complaint from a child, a parent, or a third party acting with the child’s consent. Any type of social media service is eligible to apply for tier 1 status.
Tier 2 services are ‘large social media services’ which have been declared as such by the Minister on recommendation of the Commissioner but which do not qualify as tier 1. With respect to tier 2 services, the Commissioner has broader powers including the ability to issue notices to remove Harmful Content (called ‘social media service notices’) on receipt of a complaint. In addition, the Commissioner has various powers to follow up the notice, including civil penalties of up to $17,000 a day, enforceable undertakings and injunctions.
While the Commissioner does not have the power to issue notices to tier 1 services, the Commissioner has the power to revoke a tier 1 service’s classification if the service repeatedly fails to remove Harmful Content over a 12 month period or no longer satisfies the internal handling requirements. The service could then, as a matter of course, be declared a tier 2 service. Therefore, in a practical sense, the Act imposes a continuing duty on tier 1 providers to maintain high standards when handling cases involving Harmful Content internally.
Does your business have connected social media services?
The Act only applies to social media services which have the sole or primary purpose of enabling online social interactions between two or more end-users. Therefore, organisations that publish content and merely allow user commenting, liking, sharing or other social interactions (such as online newspapers and blogs) are unlikely to be subject to the provisions described above.
However, employers and employees are still subject to the general obligations of the Act. The Act gives the Commissioner the power to deal with end-users who post Harmful Content by issuing a notice requiring them to remove the content, refrain from posting further material targeted at the child, or apologise to the child. If the infringing end-user fails to comply with the notice, the Commissioner can seek an injunction in the Federal Circuit Court to enforce the notice. Since end-users may include employees who post on their employer’s behalf, an employer may find it appropriate to include notice of an employee’s responsibilities under the Act in its internal policies. This may be particularly important for organisations dealing with children and managing a significant social media presence.
A note for education and welfare service providers
If your organisation deals with children, it is likely that you already have policies in place which address cyber-bullying. Nevertheless, it may be useful to inform your wider community of the implications of the Act, including:
- that parents or other third parties who have obtained the consent of the child are able to report an instance of Harmful Content to the Commissioner;
- the relevant powers of the Commissioner as against social media companies and the end user who posts the Harmful Content; and
- the Commissioner’s role to keep parents, children and others such as teachers informed of best practices to stay safe online.
A final note
The Act is set to have a wide impact on social media services and organisations involving children. If you would like to discuss the impact of this new landscape for your organisation, please do not hesitate to contact us.
Jacquie Seemann | Partner | T +61 2 9020 5757 | M +61 412 104 851 | firstname.lastname@example.org
Lucienne Mummé | Partner | P +61 3 9641 8661 | M +61 417 146 063 | email@example.com
Chris Hartigan | Partner | T +61 3 9641 8745 | M +61 407 532 769 | firstname.lastname@example.org
Louise Russell | Partner | T +61 3 9641 8657 | M +61 411 2471 847 | email@example.com
Andrew Cardell-Ree | Partner | T +61 7 3338 7926 | M +61 448 985 291 | firstname.lastname@example.org
Karl Luke | Partner | T +61 8 8236 1280 | M +61 411 223 895 | email@example.com