Meta's Platforms Accused of Facilitating Child Sexual Exploitation
Internal documents reveal an estimated 100,000 children receive explicit content daily on Facebook and Instagram.
- Internal documents from Meta, the parent company of Facebook and Instagram, reveal that an estimated 100,000 children receive sexually explicit content, including images of adult genitalia, on these platforms every day.
- Meta's 'People You May Know' algorithm, which recommends connections to users, has been identified as a primary connector of children to potential predators.
- Despite internal concerns raised by Meta employees about child exploitation, the company allegedly deprioritized safeguards or outright blocked child safety features because they were not profitable.
- In 2020, an Apple executive reported that his 12-year-old child had been solicited on Facebook, an incident that nearly led Apple to remove Meta's apps from its App Store.
- New Mexico's Attorney General Raul Torrez has sued Meta, alleging that the company failed to protect young users from exposure to child sexual abuse material and allowed adults to solicit explicit imagery from them.