
EXPERT REACTION: Age verification extended to R-rated games and websites

Publicly released: Australia; NSW; VIC; QLD; SA; WA

From Monday, Australian adults will be required to verify their age to access R-rated video games and websites under the Age-Restricted Material Codes. Under the codes, search engines, social media platforms, 'adult' websites, app stores, gaming providers, and generative AI systems – including companion chatbots – must take 'meaningful steps' to limit children's exposure to age-inappropriate content, including high-impact violence, self-harm material, and material promoting suicide and disordered eating. Below, Australian experts comment on the changes.


These comments have been collated by the Science Media Centre to provide a variety of expert perspectives on this issue. Feel free to use these quotes in your stories. Views expressed are the personal opinions of the experts named. They do not represent the views of the SMC or any other organisation unless specifically stated.

Dr Mark Johnson is a Senior Lecturer in Digital Cultures from the Discipline of Media and Communications at the University of Sydney.

"Young people are always going to find ways around age gating to play games they want to play, and making more adult games seem forbidden is likely to both encourage young people towards them rather than the opposite, and make it harder to talk to parents or carers about their gaming.

Equally, game ratings are widely disputed by both game researchers and game developers, and in Australia in particular such ratings are often seen as 'behind the times', parochial and paternalistic.

Grading games in this way therefore does little to protect young people, and may even have the opposite effect by making it harder for them to begin to understand the adult world in age-appropriate ways during their teenage years.

Any age verification of this sort also raises serious questions about the safety of user data, which is already being captured more and more by many blockbuster and mobile games (indie games remain far more protective of user privacy).

The effects of this data capture by games or platforms have yet to be understood. Many games are also not even played on major platforms but on alternate sites like Itch.io, showing that a comprehensive application of such a policy is going to be impossible in practice, especially for more unusual, artistic, or provocative games being made by artists."

Last updated: 06 Mar 2026 2:37pm
Contact information
Contact details are only visible to registered journalists.
Declared conflicts of interest: None declared.

Dr David Tuffley is a Senior Lecturer in the School of Information and Communication Technology at Griffith University

"Australia's Age-Restricted Material Codes are an overdue step toward bringing our digital environment into line with those accepted protections that we've always given our children in the physical world.

We've never allowed newsagents to sell pornography to twelve-year-olds, and we don't allow cinemas to admit children to R-rated movies. Extending those same protections online is not radical; it's common sense.

There's convincing evidence that links early exposure to high-impact violent and sexual content with harms to adolescents. This applies particularly to sexual attitudes and self-image.

However, age verification is not a silver bullet, and we need to be realistic about its limitations: determined teenagers will find workarounds. That said, the absence of a perfect solution is not an argument for no solution at all.

There now needs to be rigorous, independent evaluation of whether the codes are producing measurable reductions in children's exposure to harmful material. We should be willing to strengthen the framework where the evidence shows it is falling short."

Last updated: 06 Mar 2026 2:28pm
Declared conflicts of interest: None declared.

Associate Professor Megan Lim is Deputy Program Director of Disease Elimination and Head of Young People's Health at the Burnet Institute

"Current age verification software and filtering software appear to be fairly ineffective at stopping young people from accessing pornography online – although they may help prevent accidental exposure. These efforts may be much more effective and preferable for younger children, particularly those who stumble across pornography accidentally. Most parents and young people in our research do support age restriction measures as a 'first line of defence' against exposure to unwanted content.

However, they had little trust in current technologies or in either companies or governments to implement these strategies. They reported significant concerns about privacy, security, and efficacy. Participants also reported a strong preference for strategies focused on education and open conversations, rather than restriction. We need further evidence and evaluation of the impacts, both positive and negative, that an age verification strategy will have."

Last updated: 06 Mar 2026 2:14pm
Declared conflicts of interest: Megan has developed a non-commercial pornography education program called The Gist.

Professor Tama Leaver is a Professor of Internet Studies at Curtin University, and a Chief Investigator in the ARC Centre of Excellence for the Digital Child

"The public discussion of the social media ban for teens often centred on whether adults would be inconvenienced by an age verification barrier, but that generally didn't happen. On Monday, as adult Australians confront age verification barriers, or denial of access altogether, across a range of services and platforms housing 'adult' content, including games and some AI tools, many will pay attention for the first time to the new regulations shaping their online experiences and options.

Some will accept that this is about ensuring young people only encounter appropriate content online, and reducing young people's exposure to harmful or unexpected content is a widely shared goal. But many disagree that these restrictions justify their unintended consequences, especially the risks to personal privacy.

A lot of adult Australians legally watch online pornography. On Monday, much of that content will either be inaccessible or behind a barrier that requires users to identify themselves. The stigma attached to pornography means many adult Australians will likely turn to workarounds, such as VPNs (your computer effectively pretending to be in a country that doesn't mandate restrictions), rather than identifying themselves. The same workarounds have been used by some under-16s to access social media post-ban. That should surprise no one.

Ideally, these new restrictions will prompt deep and ongoing conversations about what sort of content Australians believe is appropriate for children, teens and adults. No restrictions will be foolproof. And there will likely be unintended consequences that need to be understood, mapped and mitigated, at the very least."

Last updated: 06 Mar 2026 2:05pm
Declared conflicts of interest: None declared.

Dr Rahat Masood is a Senior Lecturer in the School of Computer Science and Engineering at the University of New South Wales (UNSW Sydney)

“Age verification can help reduce children’s exposure to harmful online content, but we must also think carefully about privacy and security. Many of these age-verification systems require people to upload identity documents such as passports, driver’s licences, or facial images. This creates new privacy and security risks. We have already seen incidents where identity documents collected for verification were later exposed in data breaches.

When sensitive personal information is stored at large scale, it becomes a valuable target for attackers.

Therefore, strong safeguards must be in place to protect people's identity data. In many cases, platforms rely on third-party verification providers, so we also need to ensure those companies are trustworthy and properly audited.

If those third parties are compromised, large amounts of personal data could be exposed. Governments and regulators should ensure that these providers are independently audited and meet strong security and privacy standards.

Another challenge is that children do not use just one type of online service. They move across social media, gaming platforms, messaging apps, and increasingly generative-AI systems such as AI companions or chatbots. Safety protections need to work consistently across all of these environments. Otherwise, children may simply move to platforms where fewer safeguards exist.”

Last updated: 06 Mar 2026 2:03pm
Declared conflicts of interest: None declared.

Dr Giselle Woodley is a sexologist, Lecturer and Research Fellow in the School of Arts and Humanities at Edith Cowan University

"While regulation is necessary, age verification needs to balance safety with privacy, security and human rights, including sexual expression. As per the age assurance trial, including the Government’s response to the roadmap for age verification, no measure has been found to be truly effective in balancing these components, posing threats to privacy and security for young people and adults alike.

Our research with young people exploring their perceptions of both sexual content online and age verification has indicated that they will find a workaround. They believe maturity is a more effective measure than age, and they find benefit in online content in terms of filling gaps in knowledge. Teens note that education is the best measure to reduce any unwanted impacts of sexual content online."

Last updated: 06 Mar 2026 2:02pm
Declared conflicts of interest: Giselle has declared she is Chief Investigator on projects funded by the Australian Research Council.

Dr Belinda Barnet is a Senior Lecturer in Media at Swinburne University of Technology.

"I think it’s a good idea and not an overreach. We need to make these platforms take responsibility for what kids do (or don’t) see. Yes, it can probably be circumvented with a VPN, but that’s not a sufficient reason not to even try. 

I wish the government had specified third-party age verification rather than the current pick-your-own-method approach, though. Platforms can choose to ask for identification documents, such as a driver's licence. I don't like the idea that users may need to hand over identity documents to platforms. It doesn't have to work that way."

Last updated: 06 Mar 2026 2:01pm
Declared conflicts of interest: None declared.

Professor Alan McKee is an expert on healthy sexual development and entertainment education for healthy sexuality in the Department of Media and Communications at the University of Sydney

"Our research shows that this legislation will be harmful to young people in Australia. It does not address the actual pathways that lead to unwanted exposure to sexually explicit material – for example, research suggests that a lot of this exposure involves peer group image-based abuse where extreme images are shared with friends or acquaintances in order to shock people.

The only solution to behaviour like that is education to challenge bullying. A key element of that is making sure that young people feel safe coming to trusted adults when they encounter material that upsets them.

The data shows that adolescents are already reluctant to come forward about things like that because trusted adults will often become angry at them because of the sensitivity of the topic, and the fact it forces the adults to talk about something they aren’t comfortable dealing with themselves.

This legislation now effectively criminalises adolescents who look at this material – which is going to make it even less likely that they will talk openly to trusted adults. We’re pushing this back into the dark. And that’s going to harm young people."

Last updated: 06 Mar 2026 1:56pm
Declared conflicts of interest: None declared.

Dr Amelie Burgess is a Senior Lecturer in Marketing at Adelaide University

"These new Codes are an important step because they shift responsibility back onto the platforms and services that design, recommend, and profit from such content, rather than placing the burden on parents, schools, or young people to navigate it alone. That shift is particularly important for more vulnerable audiences, including children, because online harm is not simply the result of individual choices. In my research, I find that vulnerability online often emerges from the interaction between people’s ability to critically evaluate content and the social environments they are embedded in.

When platforms are designed around persuasive features such as algorithmic recommendations, frictionless sharing, and immersive generative AI systems, they can amplify exposure, normalise harmful material, and enable emerging risks such as digital sexual violence, including deepfake sexual imagery and sexually explicit AI interactions.

At the same time, implementation and recognition of vulnerability will be critical. Age is one important factor, but vulnerability online is shaped by a range of social, psychological, and structural conditions that affect how people engage with digital content. Age assurance, therefore, needs to be robust and privacy protective, and platforms must treat these protections as an ongoing responsibility rather than a one-off compliance task. As digital technologies continue to evolve and become more accessible, safeguards will also need to adapt to protect those most at risk while reducing opportunities for people to perpetrate harmful digital practices."

Last updated: 06 Mar 2026 1:30pm
Declared conflicts of interest: None declared.

Dr Dana McKay is a senior lecturer in innovative interactive technologies at RMIT University

"In a world where half of all Australians aged 9-14 are regularly exposed to online pornography, age verification to access adult-oriented content is a natural extension of the technology tested during the social media ban, and could make online spaces safer for young people, who should be able to access the internet without stumbling upon adult content.

There are two ways to implement this directive. One is to ask people to verify their age each time they access a space or item of adult content, just as (to use the eSafety Commissioner's metaphor) people are asked to verify their age when they buy cigarettes or enter a bar.

The other is to provide persistent age verification attached to accounts or devices, where verification happens once and then never again. I would strongly argue for the first as the better standard: it guards against young people using the accounts of older people to access this material; it allows adults who don't want such material the option of not having it algorithmically fed to them (by not providing age assurance); and it could also be privacy-protecting.

A persistent age verification mechanism means storing some information about age in a system, which may include, for example, identity documents. If age verification is transitory, there is no need to store those documents. Done well, privacy-preserving age verification is likely to be hugely beneficial to internet users as a group, preventing inadvertent negative experiences not just for children, but for adults too."

Last updated: 06 Mar 2026 1:28pm
Declared conflicts of interest: None declared.

Prof Daniel Angus FQA is Professor of Digital Communication in the QUT School of Communication, Director of QUT’s Digital Media Research Centre, and Chief Investigator in the ARC Centre of Excellence for Automated Decision Making and Society

"Age-assurance requirements introduce new privacy questions because they potentially link identity verification with search behaviour. While the codes include privacy protections, many of these obligations are framed quite broadly and leave significant discretion to platforms and verification providers. When people search for sensitive information, such as mental health support, reproductive healthcare, or domestic violence services, the combination of identity checks and detailed search histories could create extremely sensitive data trails if not handled carefully.

Policies aimed at reducing harmful content often focus on restricting access, but exposure alone doesn't determine harm. What matters is the broader support environment around users: digital literacy, trusted adults, and access to reliable information. Effective online safety policy should strengthen those supports rather than relying solely on technical restrictions.

There's also a risk of technological determinism in these debates: the idea that technology alone can solve complex social problems. Age verification systems may change who can see certain material, but they don't address why harmful content exists or how it spreads. At the same time, these measures contribute to a broader shift away from the historically anonymous Internet toward one where identity verification becomes increasingly normalised."

Last updated: 06 Mar 2026 1:27pm
Declared conflicts of interest: Daniel has declared he is a member of the Queensland Chief Scientist's Social Science Reference Group. No other conflicts to declare.

Attachments

Note: Not all attachments are visible to the general public. Research URLs will go live after the embargo ends.

Media Release: eSafety Commissioner (web page)
Organisation/s: Australian Science Media Centre
Funder: AusSMC