
Facebook’s Products Harm Children

October 4, 2021 — Sub-Committee on Consumer Protection, Product Safety, and Data Security, US Senate Committee on Commerce, Science and Transportation, Washington DC

 

Chairman Blumenthal, Ranking Member Blackburn, and Members of the Subcommittee. Thank you for the opportunity to appear before you and for your interest in confronting one of the most urgent threats to the American people, to our children and our country’s well-being, as well as to people and nations across the globe.

My name is Frances Haugen. I used to work at Facebook and joined because I think Facebook has the potential to bring out the best in us. But I am here today because I believe that Facebook’s products harm children, stoke division, weaken our democracy and much more. The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their immense profits before people. Congressional action is needed. They cannot solve this crisis without your help.

I believe that social media has the potential to enrich our lives and our society. We can have social media we enjoy — one that brings out the best in humanity. The Internet has enabled people around the world to receive and share information and ideas in ways never conceived of before. And while the Internet has the power to connect an increasingly globalized society, without careful and responsible development, the Internet can harm as much as it helps.

I have worked as a product manager at large tech companies since 2006, including Google, Pinterest, Yelp and Facebook. My job has largely focused on algorithmic products like Google+ Search and recommendation systems like the one that powers the Facebook News Feed. Working at four major tech companies that operate different types of social networks, I have been able to compare and contrast how each company approaches and deals with different challenges. The choices being made by Facebook’s leadership are a huge problem — for children, for public safety, for democracy — that is why I came forward. And let’s be clear: it doesn’t have to be this way. We are here today because of deliberate choices Facebook has made.


I joined Facebook in 2019 because someone close to me was radicalized online. I felt compelled to take an active role in creating a better, less toxic Facebook. During my time at Facebook, first working as the lead product manager for Civic Misinformation and later on Counter-Espionage, I saw that Facebook repeatedly encountered conflicts between its own profits and our safety. Facebook consistently resolved those conflicts in favor of its own profits. The result has been a system that amplifies division, extremism, and polarization, undermining societies around the world. In some cases, this dangerous online talk has led to actual violence that harms and even kills people. In other cases, Facebook’s profit-optimizing machine generates self-harm and self-hate, especially for vulnerable groups like teenage girls. These problems have been confirmed repeatedly by Facebook’s own internal research.

This is not simply a matter of some social media users being angry or unstable. Facebook became a $1 trillion company by paying for its profits with our safety, including the safety of our children. And that is unacceptable.

I believe what I did was right and necessary for the common good — but I know Facebook has infinite resources, which it could use to destroy me. I came forward because I recognized a frightening truth: almost no one outside of Facebook knows what happens inside Facebook. The company’s leadership keeps vital information from the public, the U.S. government, its shareholders, and governments around the world. The documents I have provided prove that Facebook has repeatedly misled us about what its own research reveals about the safety of children, its role in spreading hateful and polarizing messages, and so much more. I appreciate the seriousness with which Members of Congress and the Securities and Exchange Commission are approaching these issues.

The severity of this crisis demands that we break out of previous regulatory frames. Tweaks to outdated privacy protections or changes to Section 230 will not be sufficient. The core of the issue is that no one can understand Facebook’s destructive choices better than Facebook, because only Facebook gets to look under the hood. A critical starting point for effective regulation is transparency: full access to data for research not directed by Facebook. On this foundation, we can build sensible rules and standards to address consumer harms, illegal content, data protection, anticompetitive practices, algorithmic systems and more.

As long as Facebook is operating in the dark, it is accountable to no one. And it will continue to make choices that go against the common good. Our common good.

When we realized tobacco companies were hiding the harms they caused, the government took action. When we figured out cars were safer with seat belts, the government took action. And today, the government is taking action against companies that hid evidence on opioids.
I implore you to do the same here.

Right now, Facebook chooses what information billions of people see, shaping their perception of reality. Even those who don’t use Facebook are impacted by the radicalization of people who do. A company with control over our deepest thoughts, feelings and behaviors needs real oversight.

But Facebook’s closed design means it has no oversight — even from its own Oversight Board, which is as blind as the public. Only Facebook knows how it personalizes your feed for you. It hides behind walls that keep the eyes of researchers and regulators from understanding the true dynamics of the system. When the tobacco companies claimed that filtered cigarettes were safer for consumers, it was possible for scientists to independently invalidate that marketing message and confirm that in fact they posed a greater threat to human health.[1] But today we can’t make this kind of independent assessment of Facebook. We have to just trust what Facebook says is true — and they have repeatedly proved that they do not deserve our blind faith.

This inability to see into Facebook’s actual systems and confirm that they work as the company says is like the Department of Transportation regulating cars by only watching them drive down the highway. Imagine if no regulator could ride in a car, inflate its tires, crash-test it, or even know that seat belts could exist. Facebook’s regulators can see some of the problems, but they are kept blind to what is causing them and thus can’t craft specific solutions. They cannot even access the company’s own data on product safety, much less conduct an independent audit. How is the public supposed to assess whether Facebook is resolving conflicts of interest in a way that is aligned with the public good when it has no visibility into, and no context for, how Facebook really operates? This must change.

Facebook wants you to believe that the problems we’re talking about are unsolvable. They want you to believe in false choices. They want you to believe you must choose between connecting with those you love online and your personal privacy. That in order to share fun photos of your kids with old friends, you must also be inundated with misinformation. They want you to believe that this is just part of the deal. I am here to tell you today that’s not true. These problems are solvable. A safer, more enjoyable social media is possible. But if there is one thing that I hope everyone takes away from these disclosures it is that Facebook chooses profit over safety every day — and without action, this will continue.

Congress can change the rules Facebook plays by and stop the harm it is causing.

I came forward, at great personal risk, because I believe we still have time to act. But we must act now.

Thank you.