Facebook comes under stark criticism at whistleblower hearing
Senators piled criticism onto Facebook Tuesday as a whistleblower accused the company of making choices that put profits over people.
Frances Haugen, a former Facebook product manager, testified in person before a Senate Commerce subcommittee, urging Congress to hold the tech giant accountable for what she said was the harm it inflicted on children and its refusal to properly police its content.
“Facebook should not get a free pass on choices it makes to prioritize growth and virality and reactiveness over public safety. They shouldn’t get a free pass on that because they're paying for their profits right now with our safety,” she said.
It is the second in a series of hearings the committee has held since Haugen leaked explosive internal documents to The Wall Street Journal last month. But Haugen’s appearance is drawing more attention to the issues and concerns the company’s critics have long been raising.
Senators lauded Haugen for coming forward. Sen. Ed Markey (D-Mass.) called her a “21st century hero.” Sen. Amy Klobuchar (D-Minn.) said Haugen will be the “catalyst” for Congress to take actions on proposals that have been stalled for years.
“Thank you so much Ms. Haugen, for shedding a light on how Facebook time and time again has put profit over people. When their own research found that more than 13 percent of teen girls say that Instagram made their thoughts of suicide worse, what did they do? They proposed Instagram for kids,” Klobuchar said.
Facebook announced a pause on the development of its Instagram for kids platform after the documents were released.
Sen. Marsha Blackburn (R-Tenn.), the ranking member of the committee, slammed Facebook for not doing enough to remove underage accounts.
“While Facebook says that kids below 13 are not allowed on Facebook or Instagram, we know that they are there – Facebook said they deleted 600,000 accounts recently from kids under 13. How do you get that many underage accounts if you aren’t turning a blind eye to them in the first place?” she said.
Haugen’s testimony went beyond calling out the company’s impact on young users.
She said the company’s inability to catch offensive content on its platform stems from its reliance on artificial intelligence, which it has publicly touted as a solution to combat hate speech and misinformation.
“The reality is that we've seen from repeated documents within my disclosures, is that Facebook's AI systems only catch a very tiny minority of offending content,” Haugen said during a Senate Commerce consumer protection subcommittee hearing.
Haugen said that even in the “best case scenario,” AI will catch only about 10 to 20 percent of content “like hate speech.”
In the case of ads targeting children that promote content such as drug paraphernalia, she said Facebook would likely never catch more than 10 to 20 percent of those ads if it continues to “rely on computers and not humans.”
But Haugen said Facebook has a “deep focus on scale,” meaning taking action cheaply for a “huge number of people.”
“Which is part of why they rely on AI so much,” she said.
Haugen said the issue is also underscored by Facebook being “understaffed” to address concerns.
She said there was a “pattern of behavior” in which issues were “so understaffed” that there was implicit discouragement from building better detection.
In her role on Facebook’s counterespionage team, she said, the team could only handle about a third of the cases it knew about at any given time.
“We know that if we built even a basic detector, we would likely have many more cases,” she said.