Meta made cuts to election teams ahead of Threads launch, prompting concerns for 2024

CNN
People with direct knowledge of the matter told CNN that Meta has cut staff from teams responsible for combating disinformation, coordinated troll campaigns and harassment on its platforms, raising concerns ahead of the crucial 2024 US elections.
Several members of the team that countered misinformation and disinformation during the 2022 US midterm elections were laid off in rounds of cuts last fall and this spring, a person with knowledge of the situation said. Those staffers worked as part of Meta's global effort to combat disinformation campaigns designed to undermine trust in elections or sow confusion around them.
The cuts come as Meta, the parent company of Facebook and Instagram, celebrates the runaway success of its new Threads platform, which surpassed 100 million users within five days of its launch and which opens a new potential avenue for bad actors.
When asked how many staffers were cut from Meta's teams that work on elections, a spokesperson for the company declined to provide details, telling CNN: "Protecting US 2024 elections remains one of our highest priorities and our integrity efforts are leading the industry."
The spokesperson also declined to answer CNN's questions about what additional resources Meta has deployed to monitor and moderate its new platform, but said that since 2016 the company has invested $16 billion in technology and teams to protect its users.
The decision to lay off staffers ahead of 2024, when elections will take place not only in the United States but also in Taiwan, India and Ukraine, has caused concern among people with direct knowledge of Meta's election integrity work.
Because of the disparate nature of the teams involved, even people inside the company find it difficult to estimate how many staffers work on Meta's election efforts. Among the employees most affected by the cuts, sources said, were content review specialists, who manually review election-related posts to ensure they do not violate Meta's terms of service.
Meta is trying to offset the cuts by proactively detecting accounts that spread false information about elections, according to the person, who spoke on the condition of anonymity because they were not authorized to talk to journalists.
For years, Facebook has invested heavily in teams that track down sophisticated networks of fake accounts engaged in what Meta calls "coordinated inauthentic behavior." That investment began after the 2016 presidential election, when a Russian-linked troll campaign ran amok on Facebook.
Meta's team tasked with fighting such influence campaigns, which includes former US intelligence and government officials, is generally regarded as the most capable in the social media industry. In recent years, the company has published quarterly reports exposing governments and other entities found to be running covert disinformation campaigns on its platforms.
Another person with knowledge of the situation said that teams investigating disinformation campaigns must now prioritize which campaigns to pursue and which countries to focus on, meaning some deceptive campaigns could go unnoticed.
The person stressed that Meta still has a team of professionals dedicated to these issues, many of them highly respected in the cyber and information security community.
While automated systems and artificial intelligence can detect some of these efforts, uncovering sophisticated disinformation campaigns remains a "very manual process" that requires intense scrutiny by expert staff, the person said.
The person said they feared Meta would backslide on the progress it had made by learning from its past mistakes, lessons that came at a high cost. They cited Meta's 2018 admission that its platform had been used to incite violence in Myanmar.
Meta, like other social media platforms, relies on the expertise of academics and researchers who track covert disinformation networks.
Darren Linvill, a professor at Clemson University's Media Forensics Hub, said he has sent Meta valuable tips in recent months but that the company's responses have slowed significantly.
Linvill has a track record of identifying covert accounts online and helped uncover a Russian meddling effort in Africa in 2020. He said Meta recently removed a network of Russian-language accounts that posted both pro- and anti-Ukraine content on Facebook.
The accounts, he said, were trying to incite anger on both sides.
Threads, which launched last Thursday, has been a huge success, with celebrities, politicians and journalists flocking to it.
Rather than being tied directly to Facebook, the new Twitter-style app is linked to users' existing Instagram accounts. Instagram and Threads currently share the same community standards, but Meta's approach to combating misinformation already differs between the two platforms.
Meta labels state-controlled media accounts on Facebook and Instagram, including those of Russia's Sputnik news agency and China's CCTV, but those labels are not visible on the same accounts on Threads.
The launch of Threads, even as Meta trims its disinformation-focused personnel, comes at a turbulent and transformative time for those tasked with writing and enforcing the rules of social media platforms.
Elon Musk, the billionaire who purchased Twitter last year, has essentially ripped up the platform's rulebook and slashed the teams responsible for enforcing policies against disinformation.
YouTube announced last month that it would allow videos falsely claiming the 2020 US presidential election was stolen, a reversal of its previous policy.
The rule reversals come as the Republican-controlled House of Representatives investigates interactions between technology companies and the federal government.
Last week, a federal judge in Louisiana ordered top Biden administration officials and agencies not to communicate with social media companies about certain content, a victory for the Republican-led states that had accused the government of going too far in its efforts to combat Covid-19 misinformation.
Katie Harbath, a former Facebook employee who led the company's global elections efforts from 2010 to 2021, said she could almost "hear" Meta global affairs president Nick Clegg saying, "We are going to be careful about what we do because we don't want to run afoul of the law."