Facebook Silences the People Who Know Its Operations Best
TECHNOLOGY, 16 Aug 2021
Tech giants use nondisparagement clauses to keep former employees from discussing the companies.
3 Aug 2021 – Since the Jan. 6 insurrection, and with vaccine misinformation and disinformation still spreading as we try to climb our way out of the pandemic, social media companies have been under more public scrutiny than ever. The public — and the White House — wants answers, and the days when companies like Facebook could fully control the narrative seem to finally be ending.
But one thing the big tech firms still have in their corner is that they’ve made it difficult for employees, current and former, to find the safety and confidence to speak about issues affecting the public that they witnessed or participated in at their jobs.
It has been a thorny few weeks for Facebook’s public relations team. The release of “An Ugly Truth: Inside Facebook’s Battle for Domination,” a new book based on interviews with about 400 current and former employees — including me — came just before President Biden directly called out social media companies for not doing enough to combat vaccine misinformation.
Unfortunately, most of the people who know the most about the company’s inner workings have been willing to speak to the press only anonymously, for fear of retaliation or breach of nondisparagement agreements that are widely used in the tech world.
I worked at Facebook for about six months in 2018, hired as global head of elections integrity ops for political ads as the company attempted to dig out from its last big public relations crisis: the scandals around Cambridge Analytica’s improper use of the site’s data and Russia’s use of the platform to interfere in the 2016 presidential election. But Facebook stripped my title and changed my job description on my second day there. I was later sidelined for questioning why we were not fact-checking political ads and for trying to help ensure that we were not allowing voter suppression to occur through these ads. After I asked to move to a different part of the company where I would be empowered to do the job I was hired to do, I was fired.
When Facebook offered me a severance package on my way out the door, I refused it. Taking the money would have required me to sign a nondisparagement agreement written so broadly that I would have been barred, forever, from saying anything negative about the company, its products, any individuals who work there and even the terms of my employment.
Such agreements have become far too prevalent in our economy. Instead of being reserved for dispute resolution, they have become the status quo in separation agreements and sometimes even in initial employment arrangements. A handful of technology companies have unprecedented — and unchecked — power over our daily interactions and lives. Their ability to silence employees exacerbates that problem, depriving the public and regulators of a means to analyze actions that affect our public health, our public square and our democracy.
In response to the current criticisms, Facebook spokespeople have attacked those of us who have come forward to discuss them. They all but accused Biden of lying and called former employees who spoke out by name in “An Ugly Truth” disgruntled, saying we espoused “self-selected truths.” Facebook can use self-generated data points and “facts” that are unverifiable by the rest of us because its data, business decisions and practices remain mostly in a black box.
In response to Biden’s statement, the company did not provide any data on how anti-vaccine information is spreading or how many people have seen or engaged with it. Former Facebook vice president Brian Boland, who spent 11 years there, broke his silence on July 18 to call on the company to share more data with the public, saying on CNN: “I haven’t seen a focus or desire to be more transparent. … It’s one of the main reasons that I quit.” A New York Times piece the next day reported that Facebook data scientists — speaking anonymously — said they had proposed last year to study the prevalence of coronavirus misinformation on the platform but were rebuffed by senior leaders.
I imagine there are former Facebook employees who could help the public and lawmakers better understand the truth. Most may never have chosen to speak publicly anyway, but the likelihood that some are silenced by nondisparagement agreements is deeply troubling. And of course, while Facebook is effectively buying silence from its employees, it defends its decisions to allow misinformation and disinformation, and even some hate speech, on the grounds that it values free expression above all else.
These nondisparagement agreements, when used preemptively rather than in resolving disputes, do not serve any purpose other than to silence employees who might speak negatively about the company or their time there. They do not protect trade secrets; confidentiality agreements already do that. And in my experience, they are not mutual: Facebook’s nondisparagement agreement did not say the company would not disparage me.
(Contacted by an editor at The Washington Post, Facebook declined to comment.)
Since leaving Facebook in November 2018, I have voiced my opinions about social media’s effects on democracy and how to hold companies accountable, based in part on my experience at Facebook and on disagreements with its public stances, policies and business decisions since I left. I believe the company made intentional decisions that harmed our democracy. As someone who had been publicly vocal about that exact problem before I joined Facebook, and had spent 18 years working on national security and democracy issues, I refused to allow the company to stifle my free speech and ability to continue working fully in the field.
That was not a decision I took lightly. But had I signed the nondisparagement agreement, I would have been permanently barred from criticizing even future actions or statements by anyone at Facebook, and the company could have sought damages or an injunction if I violated it. These agreements are often put in front of soon-to-be-former employees at a vulnerable moment, when a loss of income plus the threat of negative comments to prospective employers can loom over their decision-making. Companies can use their asymmetric power over employees to pressure them to sign, often under short deadlines.
California may be moving to curb Silicon Valley’s reliance on such agreements. In mid-August, the legislature is scheduled to vote on the Silenced No More Act, which includes protection from nondisparagement agreements in cases of workplace discrimination. Ifeoma Ozoma, whose former employer Pinterest tried to silence her after she filed a complaint about wage discrimination and retaliation, helped draft the bill. As she explained in a New York Times opinion piece, these restrictive clauses have an effect on the entire economy, including shareholders: “Unless individuals speak up, shareholders are often kept in the dark about misconduct at the companies in which they have a financial stake.”
Ideally, that legislation would set the groundwork for a conversation about whether nondisparagement agreements should be allowed at all, especially when they aren’t mutual, or at least regulated in a way that considers worker and public protections. Overly broad, indefinite agreements also raise anti-competitive concerns if they damage future employment prospects, an issue that the Biden administration signaled is a priority in a recent executive order on promoting competition.
With few avenues for transparency around their inner workings, these companies continue to assume we will take them at their word. A critical component of that transparency should come from people who have seen behind the curtain: employees. The battle between the White House and Facebook has shone an ever brighter spotlight on a company trying to silence its employees, on the problems of abusive nondisparagement practices, and on the challenge of holding companies accountable when they exert such power over former workers.
Yael Eisenstat is a future of democracy fellow at the Berggruen Institute and a former elections integrity head at Facebook, CIA officer and White House adviser.