Instagram is stepping up its use of AI to address a pressing issue: underage users lying about their age. As the Meta-owned platform faces increasing scrutiny over its impact on young users, it is rolling out a tool known as the "adult classifier." The tool analyzes various aspects of a user's account, including their followers, the content they interact with, and even birthday messages from friends, to estimate their age. If the system suspects a user is under 18, the account is automatically switched to a more restrictive type, regardless of the age claimed during the Instagram signup process.
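To make the idea concrete, here is a minimal, hypothetical sketch of how a signal-based classifier like this might combine account features into an under-18 decision. The signal names, weights, and threshold are illustrative assumptions, not Meta's actual model.

```python
from dataclasses import dataclass

# Hypothetical account signals such a classifier might weigh; the names and
# weights below are illustrative assumptions, not Meta's actual features.
@dataclass
class AccountSignals:
    follower_median_age: float       # estimated median age of the accounts they follow
    teen_content_interaction: float  # 0..1 share of interactions with teen-oriented content
    birthday_message_age_hint: int   # age inferred from "Happy 15th!"-style messages (0 if none)

def estimate_is_minor(signals: AccountSignals, threshold: float = 0.5) -> bool:
    """Combine weighted signals into a crude under-18 score (illustrative only)."""
    score = 0.0
    if signals.follower_median_age < 18:
        score += 0.4
    score += 0.4 * signals.teen_content_interaction
    if 0 < signals.birthday_message_age_hint < 18:
        score += 0.3
    return score >= threshold

# An account the sketch flags as a likely minor would be moved to the
# restrictive teen account type described in the article.
account = AccountSignals(follower_median_age=16.5,
                         teen_content_interaction=0.7,
                         birthday_message_age_hint=15)
if estimate_is_minor(account):
    print("Switching account to restricted teen settings")
```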
With over 1.1 million reports of underage users since early 2019, it's clear that Meta needs to take action. Reports have indicated that many teenagers, particularly girls, experience anxiety and mental health issues linked to their use of the platform. This scrutiny has led to various lawsuits against Meta, including one from 33 U.S. states, accusing the company of knowingly designing features that could harm young users and failing to comply with the Children’s Online Privacy Protection Act (COPPA).
These new AI-driven age verification methods will help the platform enforce its age restrictions more effectively. If a user changes their birth date, they will need to verify their age with an ID. Alternatively, a method called "social vouching" lets three followers aged 18 and above confirm that the user is indeed of age. Additionally, Meta has partnered with the tech firm Yoti, enabling users to verify their age through video selfies.
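A minimal sketch of the "social vouching" rule described above: at least three followers aged 18 or over must confirm the stated age before a birth-date change is accepted, otherwise the user falls back to ID or video-selfie verification. The function and field names are assumptions for illustration, not Instagram's API.

```python
REQUIRED_VOUCHES = 3  # three adult followers must confirm, per the article

def vouching_passes(voucher_ages: list[int], confirmations: list[bool]) -> bool:
    """Return True only if at least three followers aged 18+ confirmed the age."""
    valid_confirmations = [age >= 18 and confirmed
                           for age, confirmed in zip(voucher_ages, confirmations)]
    return sum(valid_confirmations) >= REQUIRED_VOUCHES

# Example: two adults confirm, but one voucher is 17, so vouching fails and
# the user would need an ID upload or a Yoti video selfie instead.
print(vouching_passes([19, 22, 17], [True, True, True]))  # False
print(vouching_passes([19, 22, 25], [True, True, True]))  # True
```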
Since the introduction of teen accounts in September 2024, accounts belonging to users under 18 are set to private by default and face limitations on messaging and interactions. Parents also have the option to manage their child's account settings. Those aged 16 and 17 have more flexibility to adjust privacy settings, but users under 16 need parental consent to change messaging restrictions. This approach reflects Instagram's commitment to creating a safer environment for younger users while acknowledging the challenges of verifying ages accurately.
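The age-based defaults above can be expressed as a simple settings rule. The following is an illustrative sketch under the assumptions stated in the comments; the field names are invented for clarity and do not reflect Instagram's internal configuration.

```python
# Illustrative defaults following the rules in the article: teen accounts are
# private with restricted messaging by default, 16-17 year olds can adjust
# settings themselves, and under-16s need parental consent to loosen them.
def default_teen_settings(age: int, parental_consent: bool = False) -> dict:
    return {
        "private_account": age < 18,        # private by default for all teens
        "restricted_messaging": age < 18,   # limits on who can message the teen
        "can_change_settings": age >= 16 or parental_consent,
    }

print(default_teen_settings(15))  # locked down; a parent must approve changes
print(default_teen_settings(17))  # private by default, but adjustable directly
```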
The application of AI in social media is becoming increasingly crucial as platforms like Instagram work to enhance user safety. The adult classifier is designed to address the concerns raised by both lawmakers and users about the negative impact of social media on mental health. By implementing these changes, Instagram aims to cultivate a healthier environment that prioritizes the well-being of its younger audience.
By clicking "Accept", you agree to the storing of cookies on your device to enhance site navigation, analyze site usage and assist in improving your experience.