A growing number of social media platforms are now requiring age verification, with users needing to provide an ID or undergo a facial scan to access certain features and content. This trend is being driven by calls for stronger child safety measures online, despite concerns about privacy, security, and censorship.
In the US, lawmakers are pushing bills such as the App Store Accountability Act and the Parents Over Platforms Act, which would require app stores to verify users’ ages. Discord has delayed its plan to roll out age verification globally after user backlash, though it has not abandoned the idea entirely. Other services, including ChatGPT and Google, are using AI models to flag accounts suspected of belonging to underage users and lock them down until the user provides proof of age. Roblox now requires an age check for users who want to chat, and Reddit is rolling out age verification in the UK. Meanwhile, Apple is taking steps to comply with age verification laws, and Microsoft is introducing age verification for Xbox users in the UK.
The impact of age verification on the internet is likely to be significant, with many platforms and apps needing to adapt to new laws and regulations. Google has open sourced its privacy-focused age verification technology, and other companies may follow suit. The EU is also testing a prototype age verification app, set to launch in July. Still, concerns remain about the risks to privacy and security, and about how effective age verification actually is at preventing underage access to restricted content. As the situation evolves, expect further developments on age verification across the internet.
















