Instagram will start sending alerts to parents if their teenagers repeatedly search for terms related to self-harm or suicide

Instagram will introduce a new feature that alerts parents when their teenage children repeatedly search for terms related to self-harm or suicide on the platform. This update is set to roll out to teen accounts with parental supervision protections in the US, UK, Australia, and Canada starting next week.

The new feature will send parents an alert when their child repeatedly tries to search for terms clearly associated with suicide or self-harm within a short period of time. It is only available to parents and teens who have opted in to supervision. Meta also plans to introduce a similar alert system for its AI chatbots later this year. The feature is expected to expand to other regions later this year, although a specific timeline has not been announced.

The introduction of this feature highlights Meta's efforts to improve safety on its platforms, particularly for vulnerable users. As the feature rolls out, it will be important to monitor its effectiveness and its impact on users. If it proves successful, safety features like this one may continue to evolve and expand to other areas of online safety, including Meta's AI chatbots.
