Engadget: “Last December the state of New Mexico sued Meta for failing to protect children, claiming that Facebook and Instagram algorithms recommended sexual content to minors. Now, an unredacted internal Meta presentation has been revealed, with the company’s own employees estimating that 100,000 child users were harassed daily, The Wall Street Journal reported. According to a 2021 internal document, Facebook’s “People You May Know” (PYMK) algorithm was singled out as a primary connector of children to predators. When employees reported those findings to Meta executives, they reportedly rejected recommendations that the algorithm be redesigned to stop recommending adults to minors. The feature was responsible for 75 percent of all inappropriate adult-minor contact, according to one employee. “How on earth have we not just turned off PYMK between adults and children?” another employee said. “It’s really, really upsetting,” added another. The issues were particularly insidious on Instagram, according to an internal 2020 memo, with “sex talk” 38 times more prevalent on that platform than Facebook Messenger in the US. In one case an Apple executive reported that his 12-year-old child was solicited on Instagram. “This is the kind of thing that pisses Apple off to the extend of threat[en]ing to remove us from the App Store,” said an employee charged with addressing the issue…”
See also TechCrunch – Unredacted Meta documents reveal ‘historical reluctance’ to protect children – “…In a statement to TechCrunch, New Mexico Attorney General Raúl Torrez said that Meta and Zuckerberg enabled child predators to sexually exploit children. He recently raised concerns over Meta enabling end-to-end encryption protection for Messenger, which began rolling out last month. In a separate filing, Torrez pointed out that Meta failed to address child exploitation on its platform, and that encryption without proper safeguards would further endanger minors.”