Facebook whistleblower Frances Haugen told a Senate panel Tuesday that Congress must intervene to solve the “crisis” created by her former employer’s products.
The former Facebook product manager for civic misinformation told lawmakers that Facebook’s algorithm could steer young users from something relatively innocuous like healthy recipes to content promoting anorexia in a short period of time. Though she stopped short of accusing top executives of intentionally creating harmful products, she said that ultimately, CEO Mark Zuckerberg had to be responsible for the impact of his business.
Haugen, who unmasked herself Sunday as the source behind leaked documents at the core of a revealing Wall Street Journal series about Facebook, testified before the Senate Commerce subcommittee on consumer protection. Haugen told “60 Minutes” in an interview aired this weekend that the problems she saw at Facebook were worse than anywhere else she’d worked, which includes Google, Yelp and Pinterest. She told the news program that she copied tens of thousands of pages of internal research that she took with her when she left Facebook in May.
“I saw that Facebook repeatedly encountered conflicts between its own profits and our safety,” Haugen said in her written testimony. “Facebook consistently resolved those conflicts in favor of its own profits. The result has been a system that amplifies division, extremism and polarization — and undermining societies around the world.”
In her prepared remarks, Haugen said she believes she did the right thing in coming forward, but is aware Facebook could use its immense resources to “destroy” her.
“I came forward because I recognized a frightening truth: almost no one outside of Facebook knows what happens inside Facebook,” Haugen said in her written remarks. “The company’s leadership keeps vital information from the public, the U.S. government, its shareholders, and governments around the world.”
Haugen said a turning point that convinced her of the need to bring information outside of Facebook came when the company dissolved its civic integrity team after the 2020 U.S. election. Facebook said it would integrate those responsibilities into other parts of the company. But Haugen said that within six months of the reorganization, 75% of her “pod” of seven people, most of whom had come from civic integrity, had moved to other parts of the company or left entirely.
“Six months after the reorganization, we had clearly lost faith that those changes were coming,” she said.
Though lawmakers have called on Facebook to end its plans to create an Instagram platform for kids (after it announced a temporary pause), Haugen told senators she would be “sincerely surprised” if Facebook stops working on the product.
“Facebook understands that if they want to continue to grow, they have to find new users,” Haugen said, adding that this means instilling habits in kids.
Along with her disclosures to the U.S. Senate and the Journal, Haugen also filed complaints with the Securities and Exchange Commission, claiming Facebook misled investors and advertisers by omitting or misrepresenting what it knew about how its platforms were being used, such as to spread misinformation, and the measures it was taking to combat that.
Haugen said Tuesday that, following the January 6 insurrection at the U.S. Capitol, Facebook gave its advertising staff talking points assuring advertisers that it was doing everything it could to make the platform safer, including taking down all the hate speech it found. Haugen said this was not true.
Facebook has accused the Journal of cherry-picking data, emphasizing that research showed that a majority of users surveyed in several cases found positive effects of using its products, even when a small percentage felt it made their negative feelings worse.
Haugen accused Facebook of “paying for its profits with our safety, including the safety of our children,” according to her written remarks.
Though she called on lawmakers to impose regulations on Facebook, she warned in her testimony that “Tweaks to outdated privacy protections or changes to Section 230 will not be sufficient,” referring to the legal shield that protects online platforms from liability for their users’ posts. She also said she believes a healthy social media platform is possible to achieve and that Facebook presents “false choices … between connecting with those you love online and your personal privacy.”
“The core of the issue is that no one can understand Facebook’s destructive choices better than Facebook, because only Facebook gets to look under the hood,” she said in her prepared testimony, saying transparency is the right first step.
She told lawmakers that teams at Facebook were consistently understaffed, which created “an implicit discouragement from having better detection systems.” She said that if Facebook had even a basic detector on the counter-espionage team on which she worked, it would be able to pick up on many more cases than it was already handling.
Similarly, she added that Facebook could do “substantially more” to detect children on its platform and should be required to publish those processes for Congress. She said Facebook has the ability to detect more underage kids on the platform even if they lie about their ages.
Opening the hearing Tuesday, Sen. Richard Blumenthal, D-Conn., the chairman of the subcommittee, called on Zuckerberg to come before the committee to explain the company’s actions. He called the company “morally bankrupt” for rejecting reforms offered by its own researchers.
Haugen said Zuckerberg’s unique position as CEO and founder with a majority of voting shares in the company makes him accountable only to himself.
There are “no similarly powerful companies that are as unilaterally controlled,” Haugen said.
Blumenthal said the disclosures by Haugen ushered in a “Big Tobacco moment,” a comparison Haugen echoed in her own testimony. Blumenthal recalled his own work suing tobacco companies as Connecticut’s attorney general, remembering a similar moment when enforcers learned those companies had conducted research showing the harmful effects of their products.
Sen. Roger Wicker, R-Miss., the chairman of the Commerce Committee, called the hearing “part of the process of demystifying Big Tech.”
This story is developing.