
Former Facebook employee and whistleblower Frances Haugen testifies during a Senate Committee on Commerce, Science, and Transportation hearing entitled ‘Protecting Kids Online: Testimony from a Facebook Whistleblower’ on Capitol Hill, in Washington, U.S., October 5, 2021. Jabin Botsford/Pool via REUTERS

Facebook is having a rough week. Soon after its major social media platforms – Facebook, Instagram, and WhatsApp – suffered an outage caused by a faulty configuration change, the company was accused of making “disastrous” choices with regard to children, public safety, privacy, and democracy by Frances Haugen, the whistleblower who leaked a trove of internal documents to the Wall Street Journal.

Haugen, a former product manager on Facebook’s civic misinformation team, testified before the Senate Committee on Commerce, Science, and Transportation on Tuesday and said that congressional intervention was necessary because Facebook could not solve its problems on its own. According to the tens of thousands of internal documents in Haugen’s possession, Facebook took a cavalier attitude toward user safety while pushing for higher profits.

“However, the choices being made inside Facebook are disastrous — for our children, for our public safety, for our privacy, and for our democracy — and that is why we must demand Facebook make changes,” she said.

While Haugen did not accuse any top Facebook executives of intentionally creating harmful products, she said that Facebook CEO Mark Zuckerberg must be held responsible for the impact of his business. She also spoke about Facebook’s algorithm, which she considers dangerous because it can steer younger users from relatively innocuous content toward toxic and violent material. Facebook’s current algorithm rewards posts that generate “meaningful social interactions” (MSIs). According to the leaked documents, this approach produced “unhealthy side effects on important slices of public content, such as politics and news.”

Haugen, an algorithm specialist herself, proposed a solution: instead of prioritizing content that elicits the strongest reactions from users, Facebook should present posts in a chronological feed. This, she argued, would help the company deliver safer content to its users.
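To illustrate the distinction Haugen is drawing, the sketch below contrasts an engagement-weighted ranking with a simple chronological one. The Post fields, the scoring weights, and the function names are hypothetical placeholders; Facebook’s actual MSI scoring formula is not public.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class Post:
    author: str
    created_at: datetime
    # Hypothetical engagement counts standing in for the signals behind
    # "meaningful social interactions" (comments, reshares, reactions).
    comments: int = 0
    reshares: int = 0
    reactions: int = 0

def engagement_score(post: Post) -> float:
    # Illustrative weights only; the real MSI formula is not public.
    return 3.0 * post.comments + 2.0 * post.reshares + 1.0 * post.reactions

def rank_by_engagement(posts: List[Post]) -> List[Post]:
    # Engagement-based ranking: posts that provoke the strongest
    # reactions rise to the top, regardless of when they were posted.
    return sorted(posts, key=engagement_score, reverse=True)

def rank_chronologically(posts: List[Post]) -> List[Post]:
    # Chronological feed, as Haugen proposed: newest posts first,
    # with no boost for provocative content.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)
```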

In an interview on “60 Minutes,” Haugen said that Facebook implemented safeguards to reduce misinformation ahead of the 2020 U.S. presidential election but turned them off after the election. She said Facebook presented a false choice: either keep its volatile algorithms and continue its rapid growth, or prioritize user safety and decline. Her proposed solution was the adoption of safety measures such as government oversight.

On the subject of oversight, Haugen called for transparency, which she described as a critical starting point for effective regulation; for sharing information with oversight bodies such as Congress and with academics, so they have the information needed to conduct research; and for implementing the “soft interventions” that Facebook had already identified.

She added that she “strongly encourages” reforming Section 230 of the United States Communications Decency Act so that its protections no longer cover decisions made by algorithms, leaving companies legally liable if their algorithms are found to cause harm.

Haugen also spoke about Facebook’s role in international security, saying that its moderation practices are not enough to stop the platform from being used for crimes and terrorism in the international arena.