Meta, the owner of Instagram, is introducing a new safety feature to help protect teenagers from seeing unwanted nude photos in their direct messages.
The feature will automatically blur out nude images if the recipient is identified as a teenager.
Meta said the safety measure is aimed at addressing three main problems. First, teenagers sometimes receive unsolicited nude photos they do not want to see. Second, sharing nude photos of minors can be against the law, even when teenagers send images of themselves. And third, sextortion scams trick teenagers, especially boys, into sending explicit photos and then blackmail them.
The tech giant is under mounting pressure in the United States and Europe over allegations that its apps are addictive and have fuelled mental health problems among young people.
Meta said the protection feature for Instagram’s direct messages would use on-device machine learning to analyse whether an image sent through the service contains nudity.
If the recipient is flagged as a teenager based on the birth date on their account, the photo will be blurred and a warning message will appear.
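As a rough illustration of the flow Meta describes (an on-device nudity classifier combined with an age check on the recipient), the following Python sketch uses a hypothetical nudity_score function and made-up thresholds and field names; it is not Meta's code, only a minimal sketch of the idea.

```python
# Illustrative sketch only -- not Meta's implementation. The classifier
# (`nudity_score`), the age cutoff and the confidence threshold are all
# assumptions made for the example.
from dataclasses import dataclass
from datetime import date

TEEN_MAX_AGE = 17          # assumed cutoff for "teenager"
NUDITY_THRESHOLD = 0.8     # assumed classifier confidence threshold


@dataclass
class Account:
    username: str
    birth_date: date


def age_on(birth_date: date, today: date) -> int:
    """Whole years between birth_date and today."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years


def should_blur(image_bytes: bytes, recipient: Account,
                nudity_score, today: date | None = None) -> bool:
    """Blur the image if it likely contains nudity and the recipient is a teen.

    `nudity_score` stands in for an on-device ML model returning a
    probability in [0, 1]; the image never has to leave the phone.
    """
    today = today or date.today()
    is_teen = age_on(recipient.birth_date, today) <= TEEN_MAX_AGE
    return is_teen and nudity_score(image_bytes) >= NUDITY_THRESHOLD
```

Because the check runs on the device itself, the decision to blur can be made without sending the photo to Meta's servers, which is the point of the on-device approach the company describes.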
Attorneys general of 33 U.S. states, including California and New York, sued the company in October, saying it repeatedly misled the public about the dangers of its platforms.
In Europe, the European Commission has sought information on how Meta protects children from illegal and harmful content.