Brickbat: Don't Look Down Under
Australia now requires internet search engines to blur violent and sexual images to reduce the chances that minors will see them. By June, search engines must also determine whether account holders are at least 18 years old using methods such as digital ID, facial age estimation, credit card checks, parental verification, or AI inference. For minors and users who are not signed in, explicit material must be filtered from results, and autocomplete suggestions must avoid sexual or violent language. The law also requires content related to suicide or self-harm to be downranked, with crisis resources placed prominently in results. These obligations extend to AI-generated search outputs, including systems such as Google's Gemini. Opponents say the requirements raise accuracy and privacy concerns, and even supporters concede it will be difficult to define and detect harmful content without also blocking legitimate information.