

Jul 7, 2025
Australia has significantly broadened its digital surveillance regime with the formal introduction of three new online safety codes under the Online Safety Act, effective June 27. These sweeping regulations go far beyond mandating age checks for adult websites: they place substantial compliance obligations on a wide range of tech companies, including search engines, hosting platforms, and internet service providers (ISPs). Companies that fail to meet the requirements could face fines of up to AUD 49.5 million (roughly $32.5 million USD). The codes target access to content categorized as Class 1C (fetish-based pornography) and Class 2 (sexually explicit material and high-impact themes, including violence, drug use, and social issues such as suicide or racism). Critics argue that these vague and expansive definitions could open the door to broad censorship.
Each section of the new policy addresses a different layer of the internet ecosystem. Hosting providers must now implement six compliance measures to prevent children from accessing restricted content when major platform changes occur. ISPs must offer filtering tools and safety guidance, and must block material the eSafety Commissioner deems to promote abhorrent violent acts, powers that have already been used, such as in blocking a stabbing video on X (formerly Twitter). Meanwhile, search engines must implement age verification for users within six months, filter out explicit content by default, and add features like parental controls and crisis resources. These mandates also apply to AI-powered search functions. While the Australian government claims these steps are meant to protect children, privacy advocates warn they mark a dangerous expansion of state control over the internet, raising serious concerns about surveillance, digital ID enforcement, and the erosion of free access to information online.
Stay Awake. Keep Watch.
SOURCE: Reclaim the Net