Center for Countering Digital Hate
According to its website, the Center for Countering Digital Hate "counters hate and disinformation, by disrupting the online architecture enabling its rapid worldwide growth."
Background
The Center for Countering Digital Hate was formerly known as "Brixton Endeavours Limited," incorporated on October 19, 2018, which spearheaded an "internet campaign named Stop Funding Fake News (SFFN)" that targeted advertisers of certain websites.[1] The initiative was launched by Morgan McSweeney, Keir Starmer's chief of staff and "also the campaign manager for Liz Kendall's leadership bid". Rachel Riley, "a prominent supporter of SFFN, is listed as a patron of the Center for Countering Digital Hate (CCDH), an organisation closely linked to SFFN."
Claims 'Hate Speech' on the Rise on Twitter
From Sheera Frenkel and Kate Conger of the New York Times on December 2, 2022:[2]
- "Before Elon Musk bought Twitter, slurs against Black Americans showed up on the social media service an average of 1,282 times a day. After the billionaire became Twitter’s owner, they jumped to 3,876 times a day.
- Slurs against gay men appeared on Twitter 2,506 times a day on average before Mr. Musk took over. Afterward, their use rose to 3,964 times a day.
- And antisemitic posts referring to Jews or Judaism soared more than 61 percent in the two weeks after Mr. Musk acquired the site.
- These findings — from the Center for Countering Digital Hate, the Anti-Defamation League and other groups that study online platforms — provide the most comprehensive picture to date of how conversations on Twitter have changed since Mr. Musk completed his $44 billion deal for the company in late October. While the numbers are relatively small, researchers said the increases were atypically high.
- The shift in speech is just the tip of a set of changes on the service under Mr. Musk. Accounts that Twitter used to regularly remove — such as those that identify as part of the Islamic State, which were banned after the U.S. government classified ISIS as a terror group — have come roaring back. Accounts associated with QAnon, a vast far-right conspiracy theory, have paid for and received verified status on Twitter, giving them a sheen of legitimacy.
- These changes are alarming, researchers said, adding that they had never seen such a sharp increase in hate speech, problematic content and formerly banned accounts in such a short period on a mainstream social media platform.
- “Elon Musk sent up the Bat Signal to every kind of racist, misogynist and homophobe that Twitter was open for business,” said Imran Ahmed, the chief executive of the Center for Countering Digital Hate. “They have reacted accordingly.”
- Mr. Musk, who did not respond to a request for comment, has been vocal about being a “free speech absolutist” who believes in unfettered discussions online. He has moved swiftly to overhaul Twitter’s practices, allowing former President Donald J. Trump — who was barred for tweets that could incite violence — to return. Last week, Mr. Musk proposed a widespread amnesty for accounts that Twitter’s previous leadership had suspended. And on Tuesday, he ended enforcement of a policy against Covid misinformation.
- But Mr. Musk has denied claims that hate speech has increased on Twitter under his watch. Last month, he tweeted a downward-trending graph that he said showed that “hate speech impressions” had dropped by a third since he took over. He did not provide underlying numbers or details of how he was measuring hate speech.
- On Thursday, Mr. Musk said the account of Kanye West, which was restricted for a spell in October because of an antisemitic tweet, would be suspended indefinitely after the rapper, known as Ye, tweeted an image of a swastika inside the Star of David. On Friday, Mr. Musk said Twitter would publish “hate speech impressions” every week and agreed with a tweet that said hate speech spiked last week because of Ye’s antisemitic posts.
- Changes in Twitter’s content not only have societal implications but also affect the company’s bottom line. Advertisers, which provide about 90 percent of Twitter’s revenue, have reduced their spending on the platform as they wait to see how it will fare under Mr. Musk. Some have said they are concerned that the quality of discussions on the platform will suffer.
- On Wednesday, Twitter sought to reassure advertisers about its commitment to online safety. “Brand safety is only possible when human safety is the top priority,” the company wrote in a blog post. “All of this remains true today.”
- The appeal to advertisers coincided with a meeting between Mr. Musk and Thierry Breton, the digital chief of the European Union, in which they discussed content moderation and regulation, according to an E.U. spokesman. Mr. Breton has pressed Mr. Musk to comply with the Digital Services Act, a European law that requires social platforms to reduce online harm or face fines and other penalties.
- Mr. Breton plans to visit Twitter's San Francisco headquarters early next year to perform a “stress test” of its ability to moderate content and combat disinformation, the spokesman said.
- On Twitter itself, researchers said the increase in hate speech, antisemitic posts and other troubling content had begun before Mr. Musk loosened the service’s content rules. That suggested that a further surge could be coming, they said.
- If that happens, it’s unclear whether Mr. Musk will have policies in place to deal with problematic speech or, even if he does, whether Twitter has the employees to keep up with moderation. Mr. Musk laid off, fired or accepted the resignations of more than half the company’s staff last month, including those who worked to remove harassment, foreign interference and disinformation from the service. Yoel Roth, Twitter’s head of trust and safety, was among those who quit.
- The Anti-Defamation League, which files regular reports of antisemitic tweets to Twitter and keeps track of which posts are removed, said the company had gone from taking action on 60 percent of the tweets it reported to only 30 percent.
- “We have advised Musk that Twitter should not just keep the policies it has had in place for years, it should dedicate resources to those policies,” said Yael Eisenstat, a vice president at the Anti-Defamation League, who met with Mr. Musk last month. She said he did not appear interested in taking the advice of civil rights groups and other organizations.
- “His actions to date show that he is not committed to a transparent process where he incorporates the best practices we have learned from civil society groups,” Ms. Eisenstat said. “Instead he has emboldened racists, homophobes and antisemites.”
- The lack of action extends to new accounts affiliated with terror groups and others that Twitter previously banned. In the first 12 days after Mr. Musk assumed control, 450 accounts associated with ISIS were created, up 69 percent from the previous 12 days, according to the Institute for Strategic Dialogue, a think tank that studies online platforms.
- Other social media companies are also increasingly concerned about how content is being moderated on Twitter.
- When Meta, which owns Facebook and Instagram, found accounts associated with Russian and Chinese state-backed influence campaigns on its platforms last month, it tried to alert Twitter, said two members of Meta’s security team, who asked not to be named because they were not authorized to speak publicly. The two companies often communicated on these issues, since foreign influence campaigns typically linked fake accounts on Facebook to Twitter.
- But this time was different. The emails to their counterparts at Twitter bounced or went unanswered, the Meta employees said, in a sign that those workers may have been fired.
"This Is Our Shot" Ally
The Center for Countering Digital Hate is listed as an "Ally" in the This Is Our Shot initiative led by the California Medical Association (CMA), which sought to convince people to take the coronavirus vaccine during the pandemic that began in 2020.[3]
'Disinformation Dozen'
The Center for Countering Digital Hate produced a widely publicized report titled "The Disinformation Dozen," which claimed that "a small group of individuals who do not have relevant medical expertise and have their own pockets to line...are abusing social media platforms to misrepresent the threat of Covid and spread misinformation about the safety of vaccines."[4]
From the Center for Countering Digital Hate's website:[5]
- "Just twelve anti-vaxxers are responsible for almost two-thirds of anti‑vaccine content circulating on social media platforms. This new analysis of content posted or shared to social media over 812,000 times between February and March uncovers how a tiny group of determined anti-vaxxers is responsible for a tidal wave of disinformation—and shows how platforms can fix it by enforcing their standards."
The individuals highlighted in the report were targeted by the left-wing media in articles from Forbes,[6] The Guardian,[7] and the New York Times,[8] among others.
Leadership
From the Center for Countering Digital Hate website:[9]
- Imran Ahmed, Chief Executive Officer
- Imran Ahmed is the founder and CEO of the Center for Countering Digital Hate US/UK. He is an authority on social and psychological malignancies on social media, such as identity-based hate, extremism, disinformation, and conspiracy theories. He regularly appears on the media and in documentaries as an expert in how bad actors use digital spaces to harm others and benefit themselves, as well as how and why bad platforms allow them to do so. He advises politicians around the world on policy and legislation. Imran was inspired to start the Center after seeing the rise of antisemitism on the left in the United Kingdom and the murder of his colleague, Jo Cox MP, by a white supremacist, who had been radicalized in part online, during the EU Referendum in 2016. He holds an MA in Social and Political Sciences from the University of Cambridge. Imran lives in Washington DC, and tweets at @Imi_Ahmed.
- Sarah Eagan, Chief of Staff
- Sarah Eagan is the Chief of Staff for the Center for Countering Digital Hate. Prior to this role, she was responsible for developing the Center’s external affairs through its stakeholder, policy, and partnership program. Sarah previously served as a Press Secretary for NextGen America and has worked with organizations on developing their defenses against disinformation, conducting opposition research, and strengthening the research basis to support social movements. A Philadelphia native, she currently resides in London, UK and is employed by CCDH UK.
- Eva Hartshorn-Sanders, Head of Policy
- Eva Hartshorn-Sanders is the Head of Policy at the Center for Countering Digital Hate. Eva brings over 20 years’ experience working in NGOs, government agencies and the private sector to this role, with professional experience in the Asia-Pacific, US and European regions. As the Head of Policy, Eva is working on legislative reform, partnerships and political education. In her previous life, Eva was a key adviser on the New Zealand Government’s response to the Christchurch mosque terrorist attacks, including with New Zealand’s media regulator, supporting the second summit of the Christchurch Call, and reforming terrorism legislation. She has also worked with the UN on a climate change and space technology project in the Pacific, and as a senior legislative and political adviser in the UK House of Lords. Eva is based in Washington DC, and is employed by CCDH US.
- Callum Hood, Head of Research
- Callum Hood is Head of Research at the Center for Countering Digital Hate where he leads a global team investigating online hate and misinformation. He is responsible for the Center’s reports, including those exposing the deadly anti-vaxx industry and other malignant online actors, and how social media companies profit from disinformation on their platforms. Callum is based in the UK and employed by CCDH UK.
- Tom Lavelle, Head of Campaigns
- Tom Lavelle is the Head of Campaigns at the Center for Countering Digital Hate. His team manages the campaign, digital, press and events projects across the Center. Prior to CCDH Tom worked for a global education charity, the Office of Gordon and Sarah Brown, and spent almost 10 years working as a senior campaigns and digital professional in organizations advocating for change. He has advised advocacy movements around the world. Tom is based in the UK and employed by CCDH UK.
- Simon Clark, Chair of the Board, CCDH US & UK
- Simon Clark is a resident senior fellow at the Atlantic Council’s Digital Forensic Lab based in Washington, D.C. Simon chairs the Center for Countering Digital Hate US/UK. He is a director and former chair of Foreign Policy for America, the advocacy organization for principled American engagement in the world. He was a senior fellow at the Center for American Progress where he led their work on combating violent white supremacy that informed the White House’s Domestic Terrorism strategy.
References
- ↑ Exclusive: Labour right linked to campaign to shut down The Canary (Accessed December 2, 2022)
- ↑ Hate Speech’s Rise on Twitter Is Unprecedented, Researchers Find (Accessed December 2, 2022)
- ↑ This is our shot info (Accessed March 17, 2021)
- ↑ The Disinformation Dozen (Accessed December 2, 2022)
- ↑ Why platforms must act on twelve leading online anti-vaxxers (Accessed December 2, 2022)
- ↑ De-platform The Disinformation Dozen (Accessed December 2, 2022)
- ↑ Majority of Covid misinformation came from 12 people, report finds (Accessed December 2, 2022)
- ↑ The Most Influential Spreader of Coronavirus Misinformation Online (Accessed December 2, 2022)
- ↑ Center for Countering Digital Hate Website (Accessed December 2, 2022)