The day after Elon Musk closed his deal to buy Twitter, the company’s Seattle office held a Halloween party for employees and their children. Rebecca Scott Thein dressed in bright green to play an alien to her daughter’s Buzz Lightyear. Thein, whose job at Twitter (now X) was to help the platform plan for and navigate elections, was driving to the party when an urgent call came in. On the other end of the phone was a member of Twitter’s policy team. The company had just received a “consent decree”—essentially, a threat of legal action—in Brazil, which was about to hold runoffs for highly polarized presidential and gubernatorial elections.
An avowed free speech absolutist, Musk had already publicly announced that he would pare back content moderation—the systems and teams that Twitter had in place to deal with problematic material on its platform. The problem was that Twitter had committed to curbing election-related misinformation in Brazil, and the Brazilian authorities wanted the company to stand by its promises. If it didn't comply, the policy team member told her, they could fine the company or shut off the platform—which had more than 19 million users in the country. Something needed to be done, and quickly.
Thein recalls arriving at an office of listless employees—many playing foosball and lounging about, as there was no work to be done. Shortly after Musk took over, the company had locked down many of its internal systems to ensure no changes were made during the leadership handover (and the coming layoffs). “Our active directory got shut off, all of our systems were shut off,” says Thein. She had no way of knowing which leaders still worked at the company or who to bring the alert to. “I got this call and I just thought, ‘Oh, no. What do I do? No one is online.’”
Thein ducked into a glass-walled conference room and, using what she knew of Twitter’s email conventions, began guessing at the contact details of the new leadership team. As parents and children arrived to find a DJ and inflatable ghosts overlooking the Seattle skyline, Thein wondered who was even around to do anything about the Brazil problem.
What followed was a chaotic rush to try to plug gaps in Twitter’s processes and prevent the platform from becoming a vector of mis- and disinformation during a major election. To understand what happened, WIRED spoke with five people involved in managing the crisis.
Thein now worries that what she experienced in those early days of Musk’s leadership was less a fluke than a harbinger. A year later, she and other former employees and experts fear that X, gutted by layoffs and helmed by a leader hostile to moderation, is careening toward disaster in 2024. It’s a year in which more than 50 countries—including the US—will hold elections.
X did not respond to requests for comment.
By the time Elon Musk appeared at Twitter’s offices in San Francisco on the eve of his takeover, holding a sink, Thein had been at Twitter for a little over a year. The company had been burned during the 2020 US elections—when its ad hoc approach to dealing with then-president Donald Trump left it open to accusations that it had contributed to the January 6 insurrection at the US Capitol—so it was trying to build out the protocols for future elections. The Brazilian elections and the US midterms in 2022 were supposed to be a trial run for the US presidential election in 2024. Thein was on maternity leave with her second child when Musk made his offer to buy Twitter. When she returned to work at the end of July 2022, the effects of the purchase—which wouldn’t officially close until October—were already being felt. Thein put all of the names of people who “touched” elections work at Twitter in an Excel workbook. As they left over the next couple of months, she crossed them off, name after name.
These departures hampered Twitter’s plans to roll out the features it had committed to in Brazil, which included placing special badges by the names of the many candidates running for office across the country. The team had underestimated the scale of this task, Thein says. Reorganizations and resignations slowed their work too. And on September 5, 2022, extreme heat in California knocked one of Twitter’s data centers offline, temporarily making any major changes to the platform nearly impossible. The features were supposed to be in place by the beginning of October. “Already in September, we were having to go to the electoral authorities saying ‘We can’t deliver what we said we would,’” Thein says.
As the Brazilian elections approached, teams at Twitter did “tabletop” scenarios, wargaming the various threats the platform could face and how it might respond. Thein was present at one of these, where they considered questions like “What if there was a contested election?” and “What if one of the candidates declared victory prematurely?” Thein and others say the person who coordinated these exercises also left before Musk took the helm.
Among the other scenarios that were floated for wargaming, according to people with knowledge of the process, was what would happen if there were a sudden change in the company’s leadership. What would the team do if the management of Twitter was itself the risk? That scenario wasn’t run.
The Brazilian election authorities, it seemed, saw risks in Twitter’s new leadership. According to several former employees who were aware of the consent decree and among those responding to it, the authorities had two primary concerns. The first was that Musk had declared that the company would scale back content moderation. This could have put the platform in violation of the country’s laws, should it fail to take down anything flagged for removal by the electoral court. The second, these employees were told, was that Twitter had been able to roll out only some of the labels for candidates, and officials feared that the paid-for Twitter Blue mark Musk planned to launch could easily be used to create fake accounts. With thousands of people running for regional and national office across the country, they were concerned that anyone could falsely claim to be a candidate.
Musk’s open opposition to moderation meant that some Twitter staffers didn’t know how to respond to the Brazilians’ demands. Well before the billionaire closed the deal and declared “the bird is freed,” he had made his feelings clear. In April 2022, when he first offered to buy the company, Musk tweeted that he was against “censorship” that went beyond illegal speech. In June 2022, he described the platform’s moderation as biased against “half” the US. Before Musk took control of Twitter, but after he’d made his offer for the company, he tweeted a meme that implied that then-policy chief Vijaya Gadde was responsible for the company’s supposed left-wing bias, even though the company’s own research found that the platform favored right-wing content. Gadde was among the first group of executives fired after Musk took over the company. No one knew which employees or departments would be next.
Thein and her colleagues worried that the political risks might not convince Musk to let them back into Twitter’s systems to turn on the content moderation guardrails. So they came up with a new pitch around the soccer World Cup, which was due to kick off in Qatar in November.
“We thought, ‘This person is new to the company and does not understand how social media works … and may be unaware that there’s even a Brazilian runoff,’” Thein says. In soccer-obsessed Brazil, the World Cup was a massive commercial opportunity. If the platform was taken offline, it would be a huge hit to its revenue. “We reframed it as, Brazil is our third-largest market, and the World Cup is our second-largest ad revenue opportunity of the year,” Thein says. “If we get shut off in Brazil, there’s a fiduciary risk for you.”
As Thein watched the Halloween party outside the glass conference room walls, coworkers kept her updated about a meeting happening between senior employees and Musk, who they said had received the risk assessment. The World Cup framing worked, says Thein. Senior team members told her Musk was “super understanding.” With his blessing, Thein and her team began scrambling to right the ship.
Content moderation is a combination of automated and human decision-making, but there were specific automated moderation systems—bots—that Twitter had set up around the elections. For instance, machine learning algorithms could flag tweets containing certain words or terms, or label a tweet as containing election-related information (with links to official sources). But most of Twitter’s systems remained frozen, meaning someone would need to go in and turn on these particular algorithms. When Thein reached out to the employees who had the ability to do so, she was met with incredulity. “They’d say, ‘You’re telling me—all over the news [Musk] is saying ‘I’m not going to do content moderation’—you’re going to make me lose my job to give you access?’” she says.
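The kind of automated pass described above can be imagined as a simple rules-plus-model pipeline: match election-related terms, score the tweet, then either route it to human review or attach an informational label pointing to official sources. The sketch below is purely illustrative; the keywords, threshold, stand-in scoring function, and label text are all assumptions for demonstration, not Twitter’s actual systems or code.

```python
# Illustrative sketch only: a toy rules-plus-classifier pass resembling the
# kind of election labeling described above. The terms, threshold, and
# scoring stub are hypothetical, not Twitter's real systems.
from dataclasses import dataclass
from typing import Optional

# Assumed example terms and a link to Brazil's electoral court, for illustration.
ELECTION_TERMS = {"eleição", "segundo turno", "urna", "voto", "tse"}
OFFICIAL_SOURCE = "https://www.tse.jus.br"


@dataclass
class Decision:
    needs_review: bool            # route to a human moderation queue
    label: Optional[str] = None   # informational label to attach, if any


def toy_misinfo_score(text: str) -> float:
    """Stand-in for a trained model; here, a crude phrase-matching heuristic."""
    suspicious_phrases = ("fraude comprovada", "urnas hackeadas")
    return 0.9 if any(p in text for p in suspicious_phrases) else 0.1


def classify_tweet(text: str, review_threshold: float = 0.8) -> Decision:
    lowered = text.lower()
    # Only election-related tweets get special handling.
    if not any(term in lowered for term in ELECTION_TERMS):
        return Decision(needs_review=False)
    # High-scoring tweets go to human review; the rest get an info label.
    if toy_misinfo_score(lowered) >= review_threshold:
        return Decision(needs_review=True)
    return Decision(
        needs_review=False,
        label=f"Election info: official results at {OFFICIAL_SOURCE}",
    )


if __name__ == "__main__":
    print(classify_tweet("Amanhã é o segundo turno, não esqueça de votar!"))
    print(classify_tweet("As urnas foram hackeadas, fraude comprovada!"))
```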
In the end, Ella Irwin, then the vice president of product for trust and safety, was asked to create a note in the system indicating that Musk had approved certain content moderation for the Brazilian election. Twitter added labels to most of the candidates running and flagged tweets that were likely to contain election misinformation.
But that wasn’t the end of the crisis. The volume of mis- and disinformation flooding the platform in Brazil was more than the cobbled-together team could handle. Messages, seen by WIRED, show the team asking for more resources as the backlog of problematic content grew. According to the messages, the team was given just one additional person to help deal with the rapidly expanding queue.
Thein and her team managed to keep Twitter online during the Brazilian election. When it was over, they turned their attention to the next big challenge: the US midterm elections on November 8. She and her colleagues wanted the concessions made for Brazil to be extended to the US. Over the next few days, they battled to ready a plan. But on November 4, four days before the US midterm elections, Thein was one of 3,700 people, including many of the company’s trust and safety staffers, to be laid off.
Over the past year, researchers who track hate speech and mis- and disinformation have released report after report about the deterioration of moderation on the platform. The teams that are supposed to keep the platform safe have been systematically gutted. In November 2022, Musk cut some 4,400 contract workers, some of whom were tasked with content moderation. By April 2023, Musk said he had cut about 80 percent of the company’s staff. In September, the company announced that it was cutting half of its remaining trust and safety staff focused on elections.
Other policy changes haven’t helped. In April, the platform dropped the “state-affiliated media” label that had previously been added to propaganda outlets, including those from China and Russia. A recent study from disinformation research organization NewsGuard found that engagement for state-backed propaganda from Iran, Russia, and China rose substantially after the change. In October, NewsGuard found that nearly three-quarters of disinformation on X relating to the Israel-Hamas conflict came from verified “blue tick” users. And the platform’s “community notes” feature, which introduced a form of crowdsourced moderation, has at times added to the noise.
The Center for American Progress think tank estimates that some 2 billion people will vote in national elections in 2024. Most of those votes will be cast outside the US and Europe, in countries like Brazil, where social media companies already struggle to moderate in local languages. In September, the Global Coalition for Tech Justice, a group of more than 150 civil society organizations, asked tech companies to share their action plans for the 2024 elections. Alexandra Pardal, campaign director at Digital Action, a nonprofit digital rights organization that helped lead the coalition, told WIRED that X was the only company not to respond to the request at all. “I don’t see any indications that they’re ready—or want to be ready—for next year’s mega-cycle of elections,” she says.
In addition to the trust and safety personnel layoffs, “they’ve rolled back basic features of the platform that helped deal with things that will obviously pose threats to the integrity of elections around the world,” says Jesse Lehrich, cofounder of the advocacy group Accountable Tech. This includes “removing labels on government officials and state-run media accounts, and obviously Twitter Blue has been a total disaster,” he says, noting that the platform’s new revenue-sharing model could add to the chaos. “There’s a financial incentive to post outrageous or sensational content to go viral and potentially profit from it.”
Musk has also begun charging for Twitter’s API, making it harder for those who monitor the platform to assess the true scope of its content moderation problems. X has even threatened to sue organizations that are trying to hold it to account.
When Thein left Twitter, she says she refused to sign a nondisclosure agreement in exchange for severance, knowing that she might one day want to talk about what she’d experienced, and worried about the damage the platform could cause under Musk. Her fear is that flaws that preceded Musk’s acquisition will be amplified and pose huge risks for democracy. That’s why she has chosen to talk now. “If this is what takes me down, go for it,” she says. “There are worse things to be in trouble for.”