Editorial: Facebook in the hot seat
In sizzling testimony before a Senate subcommittee on consumer protection this Tuesday, Facebook whistle-blower Frances Haugen, who worked on Facebook’s civic misinformation team for nearly two years until this past May, gave senators and the public an inside look at how Facebook keeps users hooked on its feeds even as it knowingly causes personal and societal harm.
In the thousands of pages of internal documents that Haugen provided to the subcommittee, what’s clear is that Facebook’s top executives have repeatedly misled the public and investigators, and cannot be trusted to tell the truth or to act in society’s best interest.
“I’m here today because I believe Facebook’s products harm children, stoke division and weaken our democracy,” Haugen said during her testimony. “The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people.”
As for what should be done, Haugen pushed for greater transparency about how Facebook’s algorithms can steer teens toward pro-anorexia content and other damaging material, and how they boost the extreme content and misinformation most likely to elicit a reaction from users, amplification that has, in turn, fueled the rapid rise of extremist vigilante groups throughout the country.
Specifically, she urged lawmakers to modify Section 230, which shields websites from liability for content posted by their users, by “exempting decisions about algorithms” from that protection, making platforms liable for the content their algorithms choose to amplify.
Facebook’s Mark Zuckerberg pushed back this week, but his arguments were weak. He had the audacity to suggest that Facebook has not prioritized engagement to pad its bottom line, when even the most elementary look at social media in general, and Facebook in particular, shows that speech that sizzles attracts more interest than boring speech, and that Facebook has turned the art of hooking people on the content it feeds them into a science.
What should be expected of Facebook is simple: to use some of its immense profits to monitor the speech it allows on its platform in a socially responsible way, and to be liable for the speech it directs to consumers via its algorithms, just as radio, television and print publications are liable for the content they provide their consumers.
This is what community newspapers and other publishers throughout the country have long been expected to do. It’s why they remain one of the most trusted sources of news.
Would it be a huge lift for Facebook and other social media companies to meet similar expectations? Absolutely. Being a responsible entity in the media landscape (and make no mistake: Facebook is a media company, with billions of readers and billions in advertising revenue) is an expensive proposition. It requires editors, hired employees and not just artificial-intelligence programs, to prevent deliberate campaigns of misinformation, and it requires liability for the information the company chooses to publish. (Haugen noted that Facebook’s approach to catching vaccine misinformation, fraudulent-voting claims and other falsehoods is “overly reliant on artificial intelligence systems that they themselves say will likely never get more than 10 to 20% of the content.”)
Will Congress go that far to rein in Facebook’s ills? We admit that is a stretch. It would reverse decisions Congress made decades ago that established internet companies (whose development no one could then foresee) as public platforms, exempt from liability for content, rather than as publishers. We’re well aware of the promise the internet held at the time: that a free flow of information would bolster democracies around the world and truth would reign supreme.
In retrospect, that seems a grossly naïve mistake. Had internet companies been defined as publishers from the get-go, we can only imagine that the internet would have developed with higher content standards and with corporate entities that would have stemmed the flood of misinformation now eroding democracies around the world.
While such substantial change is unlikely, making Facebook and other social media companies responsible for some of the content is the least that can be done. But how likely is even that?
Again, here’s Haugen in her testimony:
“Facebook wants you to believe that the problems we’re talking about are unsolvable. They want you to believe in false choices. They want you to believe that you must choose between a Facebook full of divisive and extreme content or losing one of the most important values our country was founded upon: free speech…. That to be able to share fun photos of your kids with old friends, you must also be inundated with anger-driven virality. They want you to believe that this is just part of the deal.
“I am here today to tell you that’s not true. These problems are solvable. A safer, free-speech-respecting, more enjoyable social media is possible,” she said, later adding: “We can afford nothing less than full transparency. Left alone, Facebook will continue to make choices that go against the common good.”
The good news is that while Facebook defended its record profits and its practices (the company’s net income grew 101% in the second quarter of 2021 compared with the prior year, to $10.4 billion for the quarter), it also agreed that government regulation is needed.
“We agree on one thing,” Lena Pietsch, Facebook’s director of policy communications, said in a statement, “it’s time to begin to create standard rules for the internet. It’s been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”
What’s also clear is that social media companies have no incentive to reform; it is against their economic interests. Yet the mandate for change is obvious. For the sake of the nation’s democracy, Congress needs to act quickly and with enough muscle to stop the deliberate spread of misinformation, including enforcement parameters that will rein in bad actors.