The Australian Online Safety Act
OpenAI is committed to complying with the Australian Online Safety Act, which imposes duties on relevant service providers relating to illegal and restricted online content. We work hard to promote responsible use of our products and keep our users safe.
How we protect users
We aim to balance delivering helpful and accessible information to all users with mitigating the risks of online harm. We use a range of procedures and tools to protect users from illegal and harmful content.
Illegal content
Our procedures include measures to prevent, detect, respond to, and take enforcement action against illegal content, such as terrorism content, child sexual exploitation and abuse content, extreme violence content, and other illegal content.
We aim to review and remove illegal content as swiftly as possible when we become aware of it, whether via our own proactive detection methods or from reports from third parties, including our users. This helps to prevent users from encountering such content and minimises the length of time illegal content is present on the service.
More information on our moderation and enforcement processes is set out on our Transparency & Content Moderation page. Please see our Reporting Content page for details about how you can report content, including illegal content, on our services.
Harmful content
We aim to provide a safe online experience for all our users, and take action to protect all users (including users under 18) from harmful content that violates our policies. This includes content the Australian Online Safety Act recognises as illegal and restricted online content.*
When we become aware of such content, we take appropriate action, balancing the importance of protecting our users and ensuring they have access to information.
If you think you have encountered harmful content on our services, please report it to us and we will investigate. We are committed to user safety and will take action to help prevent violative URLs or responses from being provided to other users.
There are also third-party support resources recommended by eSafety available to you if you or someone you know has come across distressing content or experienced serious harm online. You can also get help and support from one of these counselling services:
- Lifeline Australia: 13 11 14
- Kids Helpline: 1800 55 1800
Our use of proactive technology
We use proactive technology to help prevent users from generating, encountering, or accessing illegal and harmful content on our services. This includes the use of model training and policies, content classifiers, reasoning models, hash-matching, blocklists, and other automated systems to identify content that may violate our terms or policies.
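To illustrate one of these techniques, the sketch below shows a generic form of hash-matching: comparing a cryptographic digest of content against a blocklist of digests for material already confirmed to violate policy. This is a minimal, hypothetical example; the names, data, and thresholds are illustrative assumptions, not OpenAI's actual implementation.

```python
# Hypothetical hash-matching sketch; not OpenAI's actual pipeline.
import hashlib

# Assumed blocklist: SHA-256 digests of content previously confirmed
# to violate policy (e.g. from an industry hash-sharing programme).
KNOWN_BAD_DIGESTS: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def matches_blocklist(content: bytes) -> bool:
    """Return True if this content's digest appears on the blocklist."""
    return hashlib.sha256(content).hexdigest() in KNOWN_BAD_DIGESTS

# A hash match is typically one signal among several; flagged items
# would be routed to classifiers or human review before enforcement.
if matches_blocklist(b"uploaded file bytes"):
    print("flagged: route to enforcement review")
```

Exact hash-matching of this kind only catches previously identified files, which is why services typically pair it with classifiers and other automated systems to detect near-duplicates and novel content.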
More information, including about our moderation and enforcement processes, is set out on our Transparency & Content Moderation page.
Our compliance with the Australian Online Safety Act
If you’re in Australia and think OpenAI isn’t complying with its obligations under the Australian Online Safety Act** or has used proactive technology to moderate content in a way that is not compliant with our terms, you can report this to us via our Australian Online Safety Act Reporting Form.
We will acknowledge and review your report and consider how your feedback may help us improve our processes. We aim to review reports within 10 business days, although this process may take longer for more complex reports.
The eSafety Commissioner is Australia’s independent regulator for online safety. eSafety investigates and helps you to remove illegal and harmful content such as cyberbullying of children, adult cyber abuse, and intimate images or videos shared without consent.
You can directly report cyberbullying, adult cyber abuse, image-based abuse, illegal and restricted online content, or your concerns about OpenAI’s compliance with the Australian Online Safety Act to eSafety.
Appealing content moderation decisions
If we take enforcement action based on your content or activity (including following our use of proactive technology), and you think we have made a mistake, you can report this to us and appeal our decision. Further information on how to appeal is set out on our Transparency & Content Moderation page.
We aim to review appeals promptly, though more complex cases may take longer.
* Under the Australian Online Safety Act, illegal and restricted online content ranges from seriously harmful materials, such as images and videos showing or encouraging the sexual abuse or exploitation of children, terrorist acts and other types of violent crimes or extreme violence (including murder, attempted murder, rape, torture, and violent kidnapping), and content that shows self-harm or suicide or explains how to do it, through to content which should not be accessed by children, such as simulated sexual activity, detailed nudity, high-impact violence, and eating disorder content.
** This includes our duties relating to: illegal and restricted online content; content reporting processes; freedom of expression or privacy; compliance with the registered Industry Online Safety Codes and Standards; or if you think we have used proactive technology to moderate content in a way that is not compliant with our terms.