
Facebook whistleblower slams company's practices

Facebook whistleblower Frances Haugen condemned the company's practice of shielding internal research and called upon Congress to take action.

Frances Haugen, a former product manager at Facebook, testified during a Senate committee hearing Tuesday that the company has misled users about the efficacy of its content ranking algorithms and made "disastrous" choices for user privacy and democracy.

Last month, Haugen became a whistleblower after she released Facebook's internal research on its platforms' negative impact on teen users -- particularly the harmful effect Instagram has on teen girls. Testifying before the U.S. Senate Committee on Commerce, Science and Transportation, Haugen claimed Facebook kept damaging data from the public and should be held accountable by Congress.

As long as Facebook fails to disclose internal research on its products, including Instagram, Haugen said the company is escaping accountability for its actions.

During the hearing, Haugen detailed the harms caused by the engagement-based ranking algorithms that dictate what a user sees, which she said amplify divisive and extreme content.

"Today, Facebook shapes our perception of the world by choosing the information we see," she said.
"Even those who don't use Facebook are impacted by the majority who do. A company with such frightening influence over so many people, over their deepest thoughts, feelings and behaviors, needs real oversight. But Facebook's closed design means it has no real oversight."

Facebook whistleblower says company hiding data

Facebook "hides behind walls that keeps researchers and regulators from understanding the true dynamics of their system," Haugen said.

"Facebook will tell you privacy means they can't give you data; this is not true," she said. "When tobacco companies claimed that filtered cigarettes were safer for consumers, scientists could independently invalidate these marketing messages and confirm that in fact they posed a greater threat to human health. The public cannot do the same with Facebook. We are given no other option than to take their marketing messages on blind faith."

Senators applauded Haugen's openness and willingness to discuss her insights and called her a "hero" for coming forward and testifying. The committee has already held several hearings critical of social media platforms, including one last week during which Facebook's global head of safety testified. Haugen's testimony served to affirm what some already suspected.

"You're armed with documents and evidence … about how Facebook has put profits ahead of people," said Sen. Richard Blumenthal, D-Conn., chair of the Subcommittee on Consumer Protection, Public Safety and Data Security, to Haugen during the hearing. "Among other revelations, the information you have provided to Congress is powerful proof that Facebook knew its products were harming teenagers. Facebook exploited teens using powerful algorithms that amplified their insecurities."

Haugen said that not only does Facebook hide most of its own data, but the leaked internal research also proved that "when Facebook is directly asked questions such as 'how do you impact the health and safety of our children,' they choose to mislead and misdirect."

"I believe it is vitally important for our democracy that we establish mechanisms where Facebook's internal research must be disclosed to the public on a regular basis, and that we need to have privacy-sensitive data sets that allow independent researchers to confirm whether or not Facebook's marketing messages are actually true," she said.

Haugen said that, on top of the lack of transparency around data and research, one of the biggest problems with the Facebook and Instagram platforms is the engagement-based ranking algorithms the company relies on to show users content. She said the company knows these algorithms cause "amplification problems" that can lead a user searching for healthy diet information, for example, to anorexia content.

Haugen said Facebook's internal research shows the algorithms cannot adequately identify dangerous content. Engagement-based ranking amplifies negative content and fans violent rhetoric and ethnic violence in places like Ethiopia, she said.

Fixing the problem

When asked what changes she would institute at Facebook, Haugen said she would take several steps.

Haugen said she would establish a policy for sharing information and research from inside the company with appropriate oversight bodies such as Congress; actively engage with academics to ensure Facebook's marketing messages are true; and implement interventions such as requiring a user to click on a link before resharing it. Companies like Twitter have found that requiring users to click a link before sharing reduces the spread of misinformation.

While Haugen doesn't support breaking up social media giants like Facebook, she said she strongly encourages modifying Section 230 of the Communications Decency Act, which shields social media companies from liability for what users post on their platforms, so that it no longer protects companies' decisions about how they use algorithms. Although Facebook has limited control over what a user posts, she said the company has 100% control over its algorithms and should be held accountable for the algorithms it uses to prioritize content and products.

Haugen also supported creation of a dedicated oversight body within the federal government.


"Until incentives change at Facebook, we should not expect Facebook to change," she said. "We need action from Congress."

Blumenthal said he supports reforming Section 230, requiring disclosure of internal research and allowing independent review of social media platforms, and plans to pursue those courses of action. He also called on the U.S. Securities and Exchange Commission, as well as the Federal Trade Commission, to investigate Haugen's claims about Facebook's conduct.

"Facebook appears to have misled the public and investors, and if that's correct it ought to face real penalty as a result of that misleading and deceptive misrepresentation," he said.

Mark Zuckerberg responds

Facebook CEO Mark Zuckerberg shared a lengthy Facebook post following Haugen's testimony, in which he decried the mischaracterization of Facebook's internal research and transparency efforts.

Zuckerberg said that if Facebook wanted to ignore problems, it wouldn't employ an "industry-leading research program" to understand issues around the platform or set an "industry-leading standard for transparency and reporting on what we're doing."

Zuckerberg said it's "disheartening" to see the company's internal research taken out of context and used to construct a narrative that the company doesn't care about issues facing the platform. He said it sets a bad precedent.

"If we attack organizations making an effort to study their impact on the world, we're effectively sending the message that it's safer not to look at all, in case you find something that could be held against you," Zuckerberg wrote.

Facebook has advocated for updated internet regulations for years, Zuckerberg said, noting that Congress is the right body to "assess tradeoffs between social equities" and answer questions such as the right age for teens to start using the internet, how internet services should verify users' ages, and how companies should balance teen privacy with parental insight into their activity.

During the hearing, Sen. Ed Markey, D-Mass., shared a message for Zuckerberg.

"Your time of invading our privacy, promoting toxic content and preying on children and teens is over, Congress will be taking action," Markey said. "You can work with us or not work with us, but we will not allow your company to harm our children and our families and our democracy any longer."

Makenzie Holland is a news writer covering big tech and federal regulation. Prior to joining TechTarget, she was a general reporter for the Wilmington StarNews and a crime and education reporter at the Wabash Plain Dealer.
