Social media ‘ban’ questions answered

Written by Haylie and contributed to The Smashed Avocado.

  • To quote the eSafety Commissioner, ‘The aim of the new law is to protect young people from design features that encourage you to spend too much time on screens and show you content that can be harmful to your health and wellbeing.’

    The law also aims to reduce other negative impacts of social media, such as peer pressure, cyberbullying, and exposure to harmful content.

  • Not all platforms will ban under-16s.

    The list of affected platforms may change over time, but the current list can be found on the eSafety Commissioner’s website.

  • The ban will come into effect on 10 December 2025.

    On this date, affected platforms must start taking reasonable steps to remove accounts held by people under 16 and to confirm the age of anyone creating a new account.

  • Affected platforms will have to take reasonable steps to identify accounts of underage users and either delete or deactivate them.

    You can expect this to happen any time on or after 10 December 2025.

  • There are a few things you can do to prepare for the ban:

    • Save any content from your accounts that you don’t want to lose.

    • Find other ways to stay connected, such as making a group chat on an approved platform.

    • Check restrictions to find out how you’ll be affected.

    • Follow updates on the ‘ban’ from the eSafety Commissioner for verified information on any changes.

  • Platforms will have to use some form of age-checking technology, although which type they use is largely up to them.

  • Some platforms may let you use government ID (such as a driver’s licence or proof of age card), but the guidelines state this cannot be the only option for proving your age.

    If you feel comfortable showing ID, you will likely be able to, but there will be other options too.

  • It is likely, at least initially, that some people will find ways around the new rules while platforms adjust to the ban.

    Because of this, there are measures platforms are expected to take to identify and remove users who still slip through.

    These include:

    • Not relying on users’ self-declared age.

    • Comparing social media usage patterns (e.g., time online) to the ‘typical’ use of underage users, and escalating to further checks if a user seems to be under 16.

    • Monitoring sudden changes to account details, such as a user’s age or name.

    • Monitoring sudden changes in location caused by VPNs or altered location settings.

  • There will be no penalty for any under-16s (or their parents) still using their accounts on a restricted platform.

    If a platform is found not to be removing underage users, it can be fined, but these penalties cannot extend to users who find their way on.

  • No, you will not get in trouble for reporting it, and neither will your parents.

    It is up to the platform to ensure users are of age and to make sure all users (regardless of age) are safe on its site.

  • Whether you can get your account back will depend on how each platform decides to deal with the accounts it removes.

    While it’s recommended that platforms allow you to re-activate your account, there is no requirement for them to keep it.

  • There are support services that you can contact, for free and without a referral, including:

    • Kids Helpline (1800 55 1800 and webchat) offers support for people aged 5–25. It also has a peer-support platform, MyCircle, for 12–25-year-olds.

    • 13YARN (13 92 76) offers 24/7 crisis support for First Nations people.

    • Lifeline (13 11 14 and webchat) offers 24/7 crisis support.

  • The eSafety Commissioner has developed a ‘Get-ready guide for under-16s’, which is a great place to start!

 