Meta ‘Cannot Be Trusted With Our Children,’ Whistleblower Says


A Meta whistleblower testified before Congress Tuesday, raising concerns regarding safety issues at the tech conglomerate.

Arturo Béjar, a former director of engineering for Protect and Care at Facebook, spoke before the Subcommittee on Privacy, Technology and the Law and claimed he sent CEO Mark Zuckerberg and other higher-ups an email on the matter but felt sufficient action was not taken, according to Engadget.

“Meta continues to publicly misrepresent the level and frequency of harm that users, especially children, experience on the platform,” Béjar said before the subcommittee Tuesday, according to Engadget. “And they have yet to establish a goal for actually reducing those harms and protecting children. It’s time that the public and parents understand the true level of harm posed by these ‘products’ and it’s time that young users have the tools to report and suppress online abuse.”

Béjar, who worked as a director of engineering at Facebook from 2009 through 2015 on a team focused on building “bullying tools for teens, suicide prevention, child safety, and other difficult moments that people go through where Facebook could help,” according to his LinkedIn profile, addressed matters regarding young users on the platform in his email to Meta executives. He sent that email in 2021 on the same day, Engadget points out, that Frances Haugen detailed to Congress how Meta could fix its safety issues. In the email, Béjar described the harassment his daughter experienced on Meta-owned Instagram, including unprompted misogynistic attacks that were found not to violate the social media site’s policies. Béjar said that though some executives offered support or agreed to meet with him, he does not recall receiving a response from Zuckerberg.

Though not listed on his LinkedIn profile, Béjar told Congress he returned to Meta in 2019 to join its Instagram wellbeing team, where he reportedly found that the prior work he and his colleagues had done was erased.

“The tools we had built for teenagers to get support when they were getting bullied or harassed were no longer available to them. People at the company had little or no memory of the lessons we had learned earlier,” Béjar said Tuesday.

The former Meta employee laid out practices he felt would help, including the ability to turn off “addictive features and algorithm-based recommendations,” increased data protection for younger users, soliciting feedback from users who feel they’ve been harmed on the platforms, and allowing users to flag and filter any sexually explicit messages even if they do not violate Meta’s policies. He claimed these changes would not require significant investment or impact on revenue.

“It’s time for Congress to act,” Béjar testified. “The evidence, I believe, is overwhelming.”


Image credits: Header photo licensed via Depositphotos.
