Big tech companies, including Facebook-owner Meta and Google, said Tuesday they would team up in a new program to fight online child sexual abuse or exploitation.
Online child abuse is a hot-button issue for regulators, and tech companies are eager to show they are taking adequate measures to protect kids and teens.
In the new program, called Lantern, the big tech companies will share signals of activity that violates their policies on child exploitation so that platforms can move more quickly to detect, take down and report problematic content.
Signals could include email addresses, certain hashtags or keywords used either to groom young people into being abused or to buy and sell material involving child abuse and exploitation.
“Until now, no consistent procedure existed for companies to collaborate against predatory actors evading detection across services,” said Sean Litton, executive director of the Tech Coalition, which brings together tech companies on the issue.
“Lantern fills this gap and shines a light on cross-platform attempts at online child sexual exploitation and abuse, helping to make the internet safer for kids,” Litton added.
Other platforms in the Tech Coalition include Snap, Discord and Mega, a privacy-focused platform from New Zealand.
The Tech Coalition said that during a pilot of the program, Meta removed more than 10,000 Facebook profiles, pages and Instagram accounts after data was shared by Mega.
Meta reported the accounts involved to the US-based National Center for Missing & Exploited Children and shared findings with other platforms for their own investigations.
“Predators don’t limit their attempts to harm children to individual platforms,” said Antigone Davis, Global Head of Safety at Meta.
“The technology industry needs to work together to stop predators and protect children on the many apps and websites they use,” she added.
The announcement of Lantern came on the same day that a former Meta senior engineer told a Senate hearing in Washington that top executives, including Mark Zuckerberg, ignored his warnings that teens were unsafe on the company’s platforms.
Arturo Bejar told lawmakers that in an internal survey of 13- to 15-year-olds on Instagram, 13 percent of respondents said they had received unwanted sexual advances on the platform in the previous seven days.
“Meta knows the harm that kids experience on their platform and executives know that their measures fail to address it,” Bejar said.