At the end of November, the ban passed the Australian federal parliament.
Details remain vague: we don't yet have a full list of which platforms will fall under the legislation, or what the ban will look like in practice. However, the government has signaled that trials of age assurance technologies will be central to its implementation approach.
Video games and online gaming platforms are not included in Australia’s social media ban. But we can anticipate how enforcing an online ban might (not) work by looking at China’s large-scale use of age verification technologies to limit youth video game consumption.
In China, strict rules limit children under 18 to just one hour of online gaming on specific days. This approach highlights significant challenges in expanding and enforcing such regulations, from ensuring compliance to safeguarding privacy.
“Spiritual Opium”
China is home to a large video game industry. Its tech giants, such as Tencent, are increasingly shaping the global gaming landscape. However, youth video game consumption is a much thornier issue in China.
The country has a deep cultural and social history of associating video games with addiction and harm, often calling them “spiritual opium.” This narrative frames gaming as a potential threat to young people’s physical, mental, and social well-being.
For many Chinese parents, this perception influences how they view their children’s play. They often see video games as a disruptive force that undermines academic success and social development.
Parental anxiety like this has paved the way for China to implement strict regulations on children’s online games. This approach has received widespread support from parents.
In 2019, China introduced a law limiting gaming for those under 18 to 90 minutes a day on weekdays and three hours a day on weekends. A "curfew" also prohibited gaming between 10pm and 8am.
A 2021 amendment further limited playing time to a single hour, from 8pm to 9pm, on Fridays, Saturdays, Sundays and public holidays.
In 2023, China expanded this regulatory framework beyond online gaming to include live-streaming platforms, video-sharing sites and social media, requiring platforms to establish and improve "addiction prevention systems."
How is it applied?
Major gaming companies in China have implemented various mechanisms to ensure compliance with these regulations. Some games incorporate age verification systems that require players to provide their real name and ID to confirm their age.
Some have even introduced facial recognition to ensure minors comply with the rules. This approach has raised privacy concerns.
In parallel, mobile device manufacturers, app stores and app developers have introduced "minor modes": features in mobile games and apps that cut off access once a designated time limit is reached, with an exception for apps pre-approved by parents.
A November 2022 report from the China Game Industry Research Institute, a state-affiliated organization, declared the policy a success. More than 75 percent of minors reportedly spent less than three hours a week gaming, and officials said the restrictions had curbed "internet addiction."
However, these policies still face significant enforcement challenges and highlight a broader set of ethical issues.
Does it work?
Despite China's strict rules, many young players find ways around them. A recent study revealed that more than 77% of minors surveyed had circumvented real-name verification by registering accounts under the names of older relatives or friends.
Additionally, a growing black market for gaming accounts has emerged on Chinese trading platforms, allowing minors to rent or purchase accounts to bypass the restrictions.
Reports of minors outsmarting facial recognition mechanisms, for example by using photos of older individuals, highlight the limitations of technology-based enforcement.
The regulation also introduced unintended risks for minors, including falling victim to scams involving gaming account sellers. In one reported case, nearly 3,000 minors were collectively scammed out of more than 86,000 yuan (about AU$18,500) while trying to circumvent the restrictions.
What can Australia learn from China?
The Chinese context shows that failure to meaningfully engage with young people’s motivations to consume media can end up driving them to evade restrictions.
A similar dynamic could easily emerge in Australia. This would undermine the impact of the government’s social media ban.
In the run-up to the law, we and many colleagues argued that blanket bans implemented through technological measures of dubious effectiveness risk being both invasive and ineffective. They may also increase online risks for young people.
Instead, Australian researchers and policymakers should work with platforms to build safer online environments. This can be done using tools such as age-appropriate content filters, parental controls and screen time management features, alongside broader safety-by-design approaches.
These measures empower families and allow young people to maintain digital social connections and engage in play. These activities are increasingly recognized as vital to children’s development.
Crucially, a more nuanced approach promotes healthier online habits without compromising young people’s privacy or freedom.
Ben Egliston has received funding from the Australian Research Council (DE240101275). He has previously received funding from Meta and TikTok.
Marcus Carter was awarded an Australian Research Council Future Fellowship (#220100076) on “The Monetization of Children in the Digital Games Industry.” He has previously received funding from Meta, TikTok and Snapchat and has consulted for Telstra. He is currently a board member and former president of the Digital Games Research Association of Australia.
Tianyi Zhangshao does not work for, consult with, own shares in, or receive funding from any company or organization that would benefit from this article, and has not disclosed any relevant affiliations beyond her academic appointment.