That's simply not accurate. Facebook makes only a token effort to block Hezbollah and cartel recruiting and fundraising.
Our advocacy work indicates that Facebook only removes Hezbollah pages as a direct result of media attention, congressional hearings or direct intervention from concerned organizations. And once Facebook kicks off accounts and pages, it doesn’t appear to use sophisticated means to
prevent them from coming back.
In April 2018, for example, nine Hezbollah-related Facebook pages were removed after
the Counter Extremism Project publicized the links, including a tribute page to martyrs with more than 60,000 followers. Within two weeks, a
replacement popped up.
Hezbollah also uses social media to promote fundraising campaigns, meaning Facebook is effectively facilitating terror financing. In 2019, for example, Hezbollah’s Islamic Resistance Support Association
ran a crowdfunding campaign on Facebook to “Equip a Jihadi.”
The campaign urged sympathizers to give money or items of value to help Hezbollah fighters purchase necessary equipment, including boots, weapons and vests. Pro-Hezbollah accounts on Facebook circulated stories glorifying everyday people who donated jewelry and other valuables to the campaign, and encouraged further donations.
Facebook executives have repeatedly asserted that their artificial intelligence capabilities are successful at taking down terror content. In an
October 2019 speech, for example, Facebook CEO Mark Zuckerberg said, “Our AI systems identify 99 percent of the terrorist content we take down before anyone even sees it. This is a massive investment.”
The careful phrasing of this oft-repeated claim could easily be misconstrued to mean that Facebook’s AI tools remove 99 percent of all terror content on its platform, an interpretation that would sound reassuring,
but would be false. Rather, Facebook discloses that its AI systems proactively identify 99 percent of all the terror content that
the company removes. Facebook
never identifies how much overall terror content it believes it removes, and it won’t disclose the number of user reports it receives about terror content.
In fact, earlier this year, ACCO researchers
filed a whistleblower complaint reporting that 33 percent of the 68 U.S.-designated terror groups, or their leaders, operated official pages or groups on Facebook. CEP, meanwhile,
has questioned Facebook’s claim of removing 99 percent of terror content from its platforms.
This problem is not limited to terror networks, either. ACCO researchers have tracked how
Mexican drug-trafficking networks like Los Zetas and the
multinational gang Mara Salvatrucha also use Facebook to broadcast propaganda, fundraise, recruit new members, extort victims and put out hits on individuals.