Australia bans social media for under 16s: Solution or stopgap?

Australia has become the first country to restrict access to platforms deemed harmful to young people’s health and well-being. However, the measure has been accompanied by significant gaps in its design and implementation.

Marta Barquier

Australia has become the first country in the world to set an age limit on access to the most popular platforms of our time. The government has banned platforms such as TikTok, Instagram, X, and Facebook, among others, for users under the age of 16. To access them, users must now pass age-verification and personal identification checks.

“The Australian government’s stated objective is to be the first to take a stand and set limits for the major platforms,” explains Liliana Arroyo, researcher at Esade’s Institute for Social Innovation and former director general for Digital Society at the Government of Catalonia, “but it is a regulation with many gaps.”

This new legislation, which has generated significant domestic controversy and placed the country at the center of the public policy debate, aims to reduce screen time and exposure to content harmful to young people’s health. However, with the law already in force, the question is what comes next. Is it truly a solution to the problem, or merely a stopgap?

The origin of the problem

The platforms we use on a daily basis are persuasively designed to capture our attention. “They are built around hidden patterns whose goal is to keep users on them for as long as possible,” says Arroyo.

The result is that minors often end up misusing them, consuming content harmful to their health, unconsciously but excessively: from misogynistic and violent material to content that promotes eating disorders or suicide.

According to UNICEF’s report on the impact of technology on children and adolescents in Spain, the average age at which children gain access to a mobile phone is 10. The study shows how intensive use leads to higher anxiety levels, poorer quality of life, and greater exposure to harassment or cyberbullying. In fact, 58.4% of respondents reported having spoken with strangers on these platforms, 25.1% had received messages of a sexual nature, and nearly 9% had experienced pressure to send sexually explicit photos.

The problem caused by social media has been identified, but not how to prevent it

In theory, the Australian law not only points to a critical situation but also confronts it. In practice, however, is it sufficiently well developed?

The solution lies in design

The path chosen by the Australian government has been to tackle the problem at its root: banning access to the platforms most widely used by young people. Facebook, Instagram, Threads, TikTok, YouTube, Snapchat, X, Reddit, Twitch, and Kick now require facial, age, or voice verification systems, as well as other identification procedures, intended to ensure that users under 16 cannot access them.

But this system raises questions. “Does removing the platforms solve the problem, or would it make more sense to rethink their design to prevent the harm they cause?” asks Arroyo.

Since the law’s implementation, young people have taken matters into their own hands. The first and most obvious response has been to try to circumvent the controls: through VPNs (digital tools that allow users to connect to the internet via servers outside the country), by creating fake accounts, or by using their parents’ accounts. The second has been to migrate to new social networks not covered by the law, which are less regulated and could have an even more detrimental effect on young people’s health.

“This legislation indicates what to do about the problem, but not how to do it,” Arroyo notes. In fact, although the law’s implementation was announced last year, the measures adopted to enforce it leave many questions open. YouTube has been banned, but not YouTube Kids. Other platforms popular with young people, such as Discord or Roblox—which is facing a criminal investigation in Florida for negligence regarding the presence of sexual predators—have also not been restricted. Are these less harmful than the others? Where is the line drawn?

Impact on competition

One effect of the law, Arroyo notes, has been to stimulate competition in a market dominated by large technology companies. The breaking of the platforms’ oligopoly has resulted in multimillion-dollar losses for these companies and has opened the door for others to emerge.

Beyond this, the regulation does not penalize young people who manage to bypass age verification, but rather the platforms that fail to comply with the law, which risk fines of up to 49.5 million Australian dollars.

Other countries are closely following Australia’s lead, including Spain. Spain’s new Organic Law on the Protection of Minors also sets 16 as the age at which restrictions apply and will require age-verification systems. The aim is to link this proposal to the Digital Wallet Beta project, a mobile application that verifies users’ age and restricts minors’ access to adult content, which received a significant validation at the European level last October.

What social media do we want?

While the impact of this law will be assessed in the long term, its entry into force has already produced visible consequences.

The main advantage is that “it has publicly identified which types of platforms are harmful to people’s well-being,” explains Arroyo, “and it has created a situation in which change can be promoted—not only in the design of social networks, but also in the legislation of other countries.” With Australia in the spotlight, it is only a matter of time before other governments consider similar measures.

The goal should not be to eliminate social media, but to transform these platforms into safe spaces

“The initiative, however, has set a negative precedent. It has been designed without young people—the main stakeholders—and without a roadmap defining what kind of social networks we want,” Arroyo argues.

Young people’s reaction to this new law has been marked by frustration and a sense of loss. “They talk about emptiness because they have lost access to social networks, but also to their communities, spaces, and connections—many of which they had built over time and now do not know if they will be able to recover.” This new situation affects not only minors, but also their families, who have become responsible for supporting them through this transition.

New possibilities

As with any major change, a period of uncertainty and new possibilities now begins. The repercussions of this law will not be immediate, and its effects will become apparent over time.

The objective now “should focus on identifying the indicators that make digital innovation responsible,” concludes Arroyo. Perhaps the real question is what we truly want from social media—its content and the way it is presented. And perhaps the Australian experience will allow other governments to avoid repeating the same mistakes.

Addressing this change together with young people, creating healthy and transparent digital spaces, and transforming the design of social networks to ensure user well-being could be steps toward a more inclusive and people-centered legislative approach.

All written content is licensed under a Creative Commons Attribution 4.0 International license.